</noscript><div class="dialog-off-canvas-main-canvas" data-off-canvas-main-canvas=""><div class="layout-container"><div class="region region-featured-top"><div class="tags"><span class="tag-name"><a href="https://builtin.com/data-science" target="_blank">Data Science</a></span> <span class="tag-name"><a href="https://builtin.com/expert-contributors" target="_blank">Expert Contributors</a></span></div><h1 class="title">A Step-by-Step Explanation of Principal Component Analysis (PCA)</h1><div class="subtitle">Learn how to use PCA when working with large data sets.</div><div class="written-by-container">Written by <a href="/authors/zakaria-jaadi" hreflang="en">Zakaria Jaadi</a>, Data Scientist at HP</div><div class="updater-name"><span class="text">UPDATED BY</span> <a href="/authors/brennan-whitfield">Brennan Whitfield</a><span class="date"> | Mar 29, 2023</span></div><div class="updater-name reviewer-name-box"><span class="text">REVIEWED BY</span> <a href="/authors/sadrach-pierre">Sadrach Pierre</a></div><div class="blog-image"><img src="https://builtin.com/cdn-cgi/image/f=auto,quality=80,width=752,height=435/https://builtin.com/sites/www.builtin.com/files/styles/byline_image/public/2022-09/principal-component-analysis.jpg" width="752" height="435" alt="principal component analysis written in black text with blue overlay" title="Image: Shutterstock / Built In"/><div class="image-caption">Image: Shutterstock / Built In</div></div></div> <main id="page-main-content" role="main"><article class="blog-national"><div class="node--content"><div class="clearfix text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item"><p>The purpose of this post is to provide a complete and simplified explanation of <a href="https://builtin.com/machine-learning/pca-in-python" target="_blank">principal component analysis (PCA)</a>. We’ll cover how it works step by step, so everyone can understand it and make use of it, even those without a strong mathematical background.</p><p>PCA is a widely covered <a href="https://builtin.com/machine-learning" target="_blank">machine learning</a> method on the web, and there are some great articles about it, but many spend too much time in the weeds on the topic, when most of us just want to know how it works in a simplified way.</p><p>Principal component analysis can be broken down into five steps. 
I’ll go through each step, providing logical explanations of what PCA is doing and simplifying mathematical concepts such as <a href="https://builtin.com/data-science/when-and-why-standardize-your-data" target="_blank">standardization</a>, <a href="https://builtin.com/data-science/covariance-vs-correlation" target="_blank">covariance</a>, eigenvectors and eigenvalues without focusing on how to compute them.</p><div class="snippet-box snippet-box-ordered"><div><h2 class="title">How Do You Do a Principal Component Analysis?</h2><div class="description"><ol><li>Standardize the range of continuous initial variables</li><li>Compute the covariance matrix to identify correlations</li><li>Compute the eigenvectors and eigenvalues of the covariance matrix to identify the principal components</li><li>Create a feature vector to decide which principal components to keep</li><li>Recast the data along the principal components axes</li></ol></div></div></div><p>First, some basic (and brief) background is necessary for context.</p><p> </p><div class="video-embed-field-provider-youtube video-embed-field-responsive-video"><iframe width="854" height="480" loading="lazy" frameborder="0" allowfullscreen="allowfullscreen" data-src="https://www.youtube.com/embed/FD4DeN81ODY?autoplay=0&start=0&rel=0" class="b-lazy"/><noscript><div class="player-unavailable"><div class="message">An error occurred.</div><div class="submessage">Unable to execute JavaScript. Try watching this video on <a href="https://www.youtube.com/embed/FD4DeN81ODY?autoplay=0&start=0&rel=0" target="_blank">www.youtube.com</a>, or enable JavaScript if it is disabled in your browser.</div></div></noscript></div><figcaption class="video-caption">An overview of principal component analysis (PCA). 
| Video: Visually Explained</figcaption><h2>What Is Principal Component Analysis?</h2><p>Principal component analysis, or PCA, is a <a href="https://builtin.com/data-science/dimensionality-reduction-python" target="_blank">dimensionality reduction</a> method that is often used to reduce the dimensionality of large <a href="https://builtin.com/data-science" target="_blank">data sets</a> by transforming a large set of variables into a smaller one that still contains most of the information in the large set.</p><p>Reducing the number of variables of a data set naturally comes at the expense of accuracy, but the trick in dimensionality reduction is to trade a little accuracy for simplicity. Smaller data sets are easier to explore and visualize, and they make analyzing data points much easier and faster for <a href="https://builtin.com/data-science/tour-top-10-algorithms-machine-learning-newbies" target="_blank">machine learning algorithms</a>, since there are no extraneous variables to process.</p><p>So, to sum up, the idea of PCA is simple — <strong>reduce the number of variables of a data set, while preserving as much information as possible.</strong></p><p> </p><h2>Step-by-Step Explanation of PCA</h2><h3 id="a7c3">Step 1: Standardization</h3><p>The aim of this step is to standardize the range of the continuous initial variables so that each one of them contributes equally to the analysis.</p><p>More specifically, the reason it is critical to perform standardization prior to PCA is that PCA is quite sensitive to the variances of the initial variables. That is, if there are large differences between the ranges of the initial variables, the variables with larger ranges will dominate over those with small ranges (for example, a variable that ranges between 0 and 100 will dominate over a variable that ranges between 0 and 1), which will lead to biased results. 
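<p>As a minimal sketch in Python with NumPy (the data here is made up purely for illustration), standardization looks like this:</p>

```python
import numpy as np

# Toy data: rows are observations, columns are the initial variables.
X = np.array([[90.0, 60.0, 90.0],
              [90.0, 90.0, 30.0],
              [60.0, 60.0, 60.0],
              [60.0, 60.0, 90.0],
              [30.0, 30.0, 30.0]])

# Standardize each variable: subtract its mean and divide by its standard deviation.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
```

<p>After this, every column of <code>Z</code> has mean 0 and standard deviation 1, so no variable dominates just because of its range.</p>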
So, transforming the data to comparable scales can prevent this problem.</p><p>Mathematically, this can be done by subtracting the mean and dividing by the standard deviation for each value of each variable.</p><img alt="Principal Component Analysis Standardization" data-entity-type="file" data-entity-uuid="5b8c9140-3d34-4e13-b0a1-2c8295289504" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="align-center b-lazy" width="263" height="54" loading="lazy" data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520Standardization.png"/><p>Once the standardization is done, all the variables will be transformed to the same scale.</p><p> </p><h3 id="e8b5">Step 2: Covariance Matrix Computation</h3><p>The aim of this step is to understand how the variables of the input data set vary from the mean with respect to each other, or in other words, to see if there is any relationship between them. Sometimes, variables are highly correlated in such a way that they contain redundant information. So, in order to identify these correlations, we compute the <a href="https://builtin.com/data-science/mahalanobis-distance" target="_blank">covariance matrix</a>.</p><p>The covariance matrix is a <em>p</em> × <em>p</em> symmetric matrix (where <em>p</em> is the number of dimensions) that has as entries the covariances associated with all possible pairs of the initial variables. 
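<p>In code, this step is a one-liner; a quick NumPy sketch (with randomly generated stand-in data) follows:</p>

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((100, 3))     # stand-in for standardized data: 100 samples, 3 variables

# Rows are observations and columns are variables, hence rowvar=False.
cov_matrix = np.cov(Z, rowvar=False)  # 3 x 3 symmetric matrix

# Diagonal entries are the variances of each variable;
# off-diagonal entries are the covariances between pairs of variables.
```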
For example, for a 3-dimensional data set with 3 variables <em>x</em>, <em>y</em>, and <em>z</em>, the covariance matrix is a 3×3 matrix of this form:</p><figure><img alt="Covariance Matrix for 3-Dimensional Data" data-entity-type="file" data-entity-uuid="66c2e3f2-a873-4c28-944a-e42ce9da3bb9" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" width="406" height="94" loading="lazy" data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520Covariance%2520Matrix.png" class=" b-lazy"/><figcaption>Covariance Matrix for 3-Dimensional Data.</figcaption></figure><p>Since the covariance of a variable with itself is its variance (Cov(a,a)=Var(a)), the main diagonal (top left to bottom right) actually holds the variances of each initial variable. And since covariance is commutative (Cov(a,b)=Cov(b,a)), the entries of the covariance matrix are symmetric with respect to the main diagonal, which means that the upper and the lower triangular portions are equal.</p><p><strong>What do the covariances that we have as entries of the matrix tell us about the correlations between the variables?</strong></p><p>It’s actually the sign of the covariance that matters:</p><ul><li>If positive: the two variables increase or decrease together (correlated).</li><li>If negative: one increases when the other decreases (inversely correlated).</li></ul><p>Now that we know that the covariance matrix is nothing more than a table that summarizes the correlations between all the possible pairs of variables, let’s move to the next step.</p><p> </p><h3 id="2c76">Step 3: Compute the Eigenvectors and Eigenvalues of the Covariance Matrix to Identify the Principal Components</h3><p>Eigenvectors and eigenvalues are the <a href="https://builtin.com/data-science/basic-linear-algebra-deep-learning" target="_blank">linear algebra</a> concepts that we need to compute 
from the covariance matrix in order to determine the <strong><em>principal components</em></strong> of the data. Before getting to the explanation of these concepts, let’s first understand what we mean by principal components.</p><p>Principal components are new variables that are constructed as linear combinations or mixtures of the initial variables. These combinations are done in such a way that the new variables (i.e., principal components) are uncorrelated and most of the information within the initial variables is squeezed or compressed into the first components. So, the idea is that 10-dimensional data gives you 10 principal components, but PCA tries to put the maximum possible information in the first component, then the maximum remaining information in the second, and so on, until you have something like what is shown in the scree plot below.</p><figure><img alt="Percentage of Variance (Information) for each PC" data-entity-type="file" data-entity-uuid="0187d170-861a-43a4-943f-2b8bbbb38317" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" width="700" height="483" loading="lazy" data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520Principal%2520Components.png" class=" b-lazy"/><figcaption>Percentage of variance (information) for each PC.</figcaption></figure><p>Organizing information in principal components this way allows you to reduce dimensionality without losing much information, by discarding the components with low information and considering the remaining components as your new variables.</p><p>An important thing to realize here is that the principal components are less interpretable and don’t have any real meaning, since they are constructed as linear combinations of the initial variables.</p><p>Geometrically speaking, principal components represent the directions of the data that explain a <strong>maximal amount of 
variance</strong>, that is to say, the lines that capture the most information in the data. The relationship between variance and information here is that the larger the variance carried by a line, the larger the dispersion of the data points along it, and the larger the dispersion along a line, the more information it carries. To put it simply, just think of principal components as new axes that provide the best angle from which to see and evaluate the data, so that the differences between the observations are better visible.</p><p> </p><h2 id="69ca">How PCA Constructs the Principal Components</h2><p>As there are as many principal components as there are variables in the data, principal components are constructed in such a manner that the first principal component accounts for the <strong>largest possible variance</strong> in the data set. For example, let’s assume that the scatter plot of our data set is as shown below. Can we guess the first principal component? Yes, it’s approximately the line that matches the purple marks, because it goes through the origin and it’s the line along which the projection of the points (red dots) is the most spread out. 
Or, mathematically speaking, it’s the line that maximizes the variance (the average of the squared distances from the projected points (red dots) to the origin).</p><img alt="Principal Component Analysis second principal" data-entity-type="file" data-entity-uuid="9a6e0df7-e940-4bfa-93d0-36dc25cd612b" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="align-center b-lazy" width="1000" height="400" data-src="/sites/www.builtin.com/files/inline-images/national/Principal%2520Component%2520Analysis%2520second%2520principal.gif"/><p>The second principal component is calculated in the same way, with the condition that it is uncorrelated with (i.e., perpendicular to) the first principal component and that it accounts for the next highest variance.</p><p>This continues until a total of <em>p</em> principal components have been calculated, equal to the original number of variables.</p><p>Now that we understand what we mean by principal components, let’s go back to eigenvectors and eigenvalues. What you first need to know about them is that they always come in pairs, so that every eigenvector has an eigenvalue. Their number is equal to the number of dimensions of the data. For example, for a 3-dimensional data set, there are 3 variables, therefore there are 3 eigenvectors with 3 corresponding eigenvalues.</p><p>Without further ado, it is eigenvectors and eigenvalues that are behind all the magic explained above, because the eigenvectors of the covariance matrix are actually <em>the directions of the axes where there is the most variance</em> (most information), and these are what we call principal components. 
And eigenvalues are simply the coefficients attached to eigenvectors, which give the <em>amount of variance carried in each principal component</em>.</p><p>By ranking your eigenvectors in order of their eigenvalues, highest to lowest, you get the principal components in order of significance.</p><p><strong>Principal Component Analysis Example:</strong></p><p>Let’s suppose that our data set is 2-dimensional with 2 variables <strong><em>x, y</em></strong> and that the eigenvectors and eigenvalues of the covariance matrix are as follows:</p><img alt="Principal Component Analysis Example" data-entity-type="file" data-entity-uuid="90015443-2a79-436b-bab4-5f486f982e80" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="align-center b-lazy" width="466" height="159" loading="lazy" data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520Example.png"/><p>If we rank the eigenvalues in descending order, we get λ1>λ2, which means that the eigenvector that corresponds to the first principal component (PC1) is <em>v1</em> and the one that corresponds to the second principal component (PC2) is <em>v2</em>.</p><p>After obtaining the principal components, to compute the percentage of variance (information) accounted for by each component, we divide the eigenvalue of each component by the sum of the eigenvalues. If we apply this to the example above, we find that PC1 and PC2 carry 96 percent and 4 percent of the variance of the data, respectively.</p><p> </p><h3 id="d368">Step 4: Feature Vector</h3><p>As we saw in the previous step, computing the eigenvectors and ordering them by their eigenvalues in descending order allows us to find the principal components in order of significance. 
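<p>The eigendecomposition, ranking, and variance-percentage computation described above can be sketched in NumPy as follows (the data is randomly generated for illustration; <code>np.linalg.eigh</code> is used because it is designed for symmetric matrices such as a covariance matrix):</p>

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((100, 2))            # stand-in for standardized 2-variable data
cov_matrix = np.cov(Z, rowvar=False)

# eigh is appropriate for symmetric matrices like the covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(cov_matrix)

# eigh returns eigenvalues in ascending order; rank them highest to lowest
# so that column 0 of eigenvectors corresponds to PC1, column 1 to PC2, etc.
order = np.argsort(eigenvalues)[::-1]
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

# Percentage of variance (information) carried by each principal component.
explained_variance_ratio = eigenvalues / eigenvalues.sum()
```

<p>The entries of <code>explained_variance_ratio</code> play the role of the 96 percent / 4 percent split in the worked example.</p>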
In this step, we choose whether to keep all of these components or discard those of lesser significance (those with low eigenvalues), and form with the remaining ones a matrix of vectors that we call the <em>feature vector</em>.</p><p>So, the <a href="https://builtin.com/machine-learning/siamese-network" target="_blank">feature vector</a> is simply a matrix that has as columns the eigenvectors of the components that we decide to keep. This makes it the first step towards dimensionality reduction, because if we choose to keep only <strong><em>p</em></strong> eigenvectors (components) out of <strong><em>n</em></strong>, the final data set will have only <strong><em>p</em></strong> dimensions.</p><p><strong>Principal Component Analysis Example</strong>:</p><p>Continuing with the example from the previous step, we can either form a feature vector with both of the eigenvectors <em>v</em>1 and <em>v</em>2:</p><img alt="Principal Component Analysis eigen vectors" data-entity-type="file" data-entity-uuid="42024a6f-f19e-48e3-b8b9-a9e24572be33" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="align-center b-lazy" width="308" height="63" loading="lazy" data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520eigen%2520vectos.png"/><p>Or discard the eigenvector <em>v</em>2, which is the one of lesser significance, and form a feature vector with <em>v</em>1 only:</p><img alt="Principal Component Analysis eigen vectors 2" data-entity-type="file" data-entity-uuid="12d3f7d1-805e-4fb9-96a9-4f2dc4cc55ac" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="align-center b-lazy" width="154" height="63" loading="lazy" 
data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520eigen%2520vectors%25202.png"/><p>Discarding the eigenvector <em>v</em>2 will reduce dimensionality by 1, and will consequently cause a loss of information in the final data set. But given that <em>v</em>2 was carrying only 4 percent of the information, the loss will therefore not be important, and we will still have 96 percent of the information that is carried by <em>v</em>1.</p><hr/><p>So, as we saw in the example, it’s up to you to choose whether to keep all the components or discard the ones of lesser significance, depending on what you are looking for. If you just want to describe your data in terms of new variables (principal components) that are uncorrelated, without seeking to reduce dimensionality, leaving out the less significant components is not needed.</p><p> </p><h3 id="498a">Step 5: Recast the Data Along the Principal Components Axes</h3><p>In the previous steps, apart from standardization, you do not make any changes to the data; you just select the principal components and form the feature vector, but the input data set always remains in terms of the original axes (i.e., in terms of the initial variables).</p><p>In this step, which is the last one, the aim is to use the feature vector formed using the eigenvectors of the covariance matrix to reorient the data from the original axes to the ones represented by the principal components (hence the name principal component analysis). 
<p>This can be done by multiplying the transpose of the original data set by the transpose of the feature vector.</p><img alt="Principal Component Analysis feature vector" data-entity-type="file" data-entity-uuid="229bfcf3-0a34-4504-b9c3-f498f739637e" src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="align-center b-lazy" width="700" height="27" loading="lazy" data-src="https://builtin.com/sites/www.builtin.com/files/styles/ckeditor_optimize/public/inline-images/national/Principal%2520Component%2520Analysis%2520feature%2520vector.png"/><p> </p><p><strong>References</strong>:</p><ul><li>Steven M. Holland, University of Georgia: Principal Components Analysis</li><li>Skymind: Eigenvectors, Eigenvalues, PCA, Covariance and Entropy</li><li>Lindsay I. Smith: A Tutorial on Principal Component Analysis</li></ul></div></body>