AI Standardization


By pairing AI with human expertise, the Standardization team turns raw, unstructured data into standardized data that is human-interpretable, less ambiguous, and semantically well structured. We own the full data life cycle: ingesting data from the Web and other sources, and developing the AI models and data infrastructure that extract and infer economic entities. The resulting data is organized in the LinkedIn Economic Graph and served across the entire LinkedIn ecosystem, including recommendations, search, and data insights.

Machine learning tasks include information extraction from unstructured and semi-structured sources, sequence labeling, multi-label text classification, entity linking, and knowledge graph embeddings.
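To make the sequence-labeling task concrete, here is a minimal toy sketch of BIO tagging over tokens. The gazetteer, label names, and matching logic are purely illustrative assumptions for this example; production systems use learned models rather than dictionary lookup.

```python
# Toy illustration of sequence labeling with BIO tags, as used to
# extract entities (here, skills) from unstructured text.
# SKILLS is an illustrative gazetteer, not a real data source.
SKILLS = {("machine", "learning"), ("python",)}

def bio_tag(tokens):
    """Assign B-SKILL / I-SKILL / O tags to each token."""
    tags = ["O"] * len(tokens)
    lowered = [t.lower() for t in tokens]
    for i in range(len(tokens)):
        for phrase in SKILLS:
            n = len(phrase)
            if tuple(lowered[i:i + n]) == phrase:
                tags[i] = "B-SKILL"                  # begin entity span
                for j in range(i + 1, i + n):
                    tags[j] = "I-SKILL"              # inside entity span
    return tags

tokens = "Senior Machine Learning engineer with Python".split()
print(list(zip(tokens, bio_tag(tokens))))
```

A learned tagger replaces the gazetteer lookup with a per-token classifier (or CRF/transformer layer), but the input/output shape shown here is the same.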

Content Understanding

We build technology to understand what our more than 675 million members are interested in, and apply it to enhance their LinkedIn experience. We blend expertise in deep learning, natural language processing, software engineering, evaluation, and human judgment to develop and deploy topic extraction, document classification, named entity recognition, entity linking, and information extraction technologies at scale. Our work impacts the LinkedIn feed, personalized content recommendations, ads relevance, and more.
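Entity linking, one of the technologies listed above, can be sketched as resolving a raw surface form to a canonical entity ID. The alias table and ID scheme below are illustrative assumptions, not LinkedIn's actual Economic Graph schema; real linkers combine candidate generation like this with learned disambiguation.

```python
# Toy sketch of entity linking: mapping surface forms in member text
# to canonical entity IDs. ALIASES and the "skill:"/"title:" ID
# format are illustrative only.
ALIASES = {
    "ml": "skill:machine-learning",
    "machine learning": "skill:machine-learning",
    "sw engineer": "title:software-engineer",
    "software engineer": "title:software-engineer",
}

def link_entity(mention):
    """Resolve a raw mention to a canonical entity ID, or None."""
    # Normalize case and collapse runs of whitespace before lookup.
    normalized = " ".join(mention.lower().split())
    return ALIASES.get(normalized)

print(link_entity("Machine  Learning"))  # → skill:machine-learning
```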

AI Infrastructure

LinkedIn increasingly depends on AI and machine learning to deliver value to our members and customers. Our team is working to grow LinkedIn’s Economic Graph in ways that reach and impact every member of the global workforce equally, no matter where they are on the planet. To achieve this, we need to scale our infrastructure to new heights.