Member/Customer Experience

Resume Assistant: The Collaboration Between Microsoft and LinkedIn


In November 2017, Microsoft and LinkedIn introduced an exciting integration between the companies: Resume Assistant. Finding the right words to describe your work experience and knowing which skills to highlight to attract employer interest can be challenging tasks. As shared in a post by the Microsoft team last fall, “nearly 70 percent of people say they have difficulty portraying their work experience effectively, and 50 percent struggle to tailor their resume to a specific job opportunity.” Resume Assistant provides intelligent tools to help job seekers improve their resumes, right from within Microsoft Word.

In the first part of this two-part blog series, we’ll discuss the development and design process of Resume Assistant. This post covers the engineering and design challenges we faced during product creation and how we solved them together with the team at Microsoft. Stay tuned for Part 2, where we’ll share more on the AI model used to pull in the best content possible for users of Resume Assistant.

High-level architecture


Resume Assistant was built as a hybrid feature inside Word by both LinkedIn and Microsoft engineers. The Microsoft team took ownership of everything that happens within and is heavily embedded in Word (e.g., the Resume Classifier, the Resume Info Extractor, and the onboarding page), while LinkedIn engineers focused on building the main UI, service, and relevance model of the Resume Assistant feature.

Let’s introduce the Resume Classifier and Resume Info Extractor first. The Resume Classifier detects when a user opens a resume in Word and then triggers the LinkedIn Resume Assistant onboarding page. Though the onboarding page is a very simple, static web page with a button asking for the user’s consent, it is a key component of protecting the user’s privacy. Unlike other pages in Resume Assistant, this onboarding page is hosted by Microsoft. Without the user’s consent on this page, Word users won’t enter the LinkedIn-powered Resume Assistant experience, and no user data is sent to LinkedIn.

After the user consents, Resume Assistant retrieves the job title and other information that has been extracted by the Microsoft Resume Info Extractor. It gets this data through the Office JavaScript API. Automatically retrieving this data smooths the user experience in Resume Assistant. For example, the user’s title and locale are used to pull the right set of work experience examples to display in the panel, along with a set of suggested skills to cover in the resume and a list of jobs that may interest the user. These suggestions are based on LinkedIn’s deep understanding of its professional network and jobs market.
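To give a sense of how an add-in talks to the document, here is a minimal sketch of reading content from the open Word document through the Office JavaScript API. It is illustrative only and not the actual Resume Assistant code, which works with fields already extracted by the Resume Info Extractor.

```javascript
// Minimal sketch: read the open document's text from a Word add-in via the
// Office JavaScript API (Word.run batches commands against the document).
Office.onReady((info) => {
  if (info.host === Office.HostType.Word) {
    Word.run(async (context) => {
      const body = context.document.body;
      body.load("text");        // queue a read of the document text
      await context.sync();     // execute the queued commands
      console.log(body.text);   // illustrative only; the real feature uses extracted fields instead
    });
  }
});
```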

With the Microsoft team implementing the Resume Classifier and Resume Info Extractor components, how could LinkedIn engineers build the Resume Assistant service into Word without learning Word development from scratch?

Leveraging Office Add-in Platform

Usually, Word features are developed natively. If we went with a native approach, the Resume Assistant UI would talk to Microsoft’s data centers, which would in turn talk to LinkedIn to fetch data via Rest.li Gateway (Rest.li is a REST+JSON framework for building robust, scalable service architectures. Rest.li Gateway is LinkedIn’s API externalization platform). For example, LinkedIn created an Outlook profile card integration with code written directly in Outlook that talks to LinkedIn via Microsoft’s data centers.  

Another available approach was to develop on the Office Add-in Platform. The Office Add-in Platform exposes and bridges the native API through a JavaScript library, giving developers the ability to build additional functionality for Office products using web technologies. With this approach, we could create a single-page application entirely on the LinkedIn stack that could still interface with the Word document.

Creating our experience as an Office Add-in would allow us to iterate on it much more quickly than a native implementation, because we could use our own web deployment pipeline. It’s much easier to update a web application frequently than a native application installed on someone’s machine: we can push changes seamlessly because the user doesn’t need to download new code. And since it’s a web application, we could implement the Resume Assistant web code once, and all Word applications (Word for Windows, Word for Mac, and Word Online) would be able to show the tool.

Additionally, code written in Word would be owned and developed by Microsoft engineers, while the LinkedIn APIs would be developed by LinkedIn engineers. Using the Office Add-in Platform would eliminate some of the interfacing and coordination challenges of the native approach.

From a site speed perspective, it was not clear which option was best without building both and comparing. A native experience would load faster initially, because our integration code would be bundled into the Word application that the user downloads once. However, accessing LinkedIn data via an HTTP call would add a hop between the Microsoft and LinkedIn data centers, increasing our latency for subsequent page loads.

For the Resume Assistant tool, we decided to implement the product as an Office Add-in because it allows LinkedIn to iterate more quickly. This was one of our first Microsoft integrations, so we didn’t know for sure what would work best for the product. With the plugin approach, we have the flexibility to quickly make changes as needed. At LinkedIn, we have learned that making some simple UI changes can have a big impact on user engagement, so it’s important to have the flexibility to deploy UI changes quickly.

Choosing a frontend tech stack within LinkedIn

At LinkedIn, we have two main stacks for external-facing frontend applications. The first stack is called Pemberly. This stack uses a Java Play API server to provide Rest.li data to our UI code. Our frontend is written in Ember.js and is served from another Play server that uses a pool of Node processes to optimize the first page load. These optimizations include streaming the initial API call data along with the HTML and Ember application, and performing server-side rendering. This stack is primarily used for our rich, member-facing applications, like the logged-in LinkedIn.com. Its main advantage is the use of Ember to create a Single Page Application (SPA) in the web browser.

The other frontend stack we use is a simple Play server returning server-side rendered Dust templates. This stack is primarily used for SEO purposes and for our guest webpages, where site speed is king.

For our Office integration, we felt that the Pemberly stack (the first option) was the best approach. Using Ember allows us to build a very rich UI by getting a lot of Single Page Application features (like routing) for free, and it makes Resume Assistant feel more like a native experience.
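As a rough sketch of what this looks like in practice, an Ember route in the add-in might fetch panel data in its model hook; the route and model names below are hypothetical, not the actual Resume Assistant code.

```javascript
// Hypothetical Ember route: client-side routing comes with the framework, and the
// model hook fetches panel data from the Rest.li-backed API on the Play API server.
import Route from '@ember/routing/route';

export default Route.extend({
  model(params) {
    // 'work-experience-example' is an illustrative model name, not the real one.
    return this.store.query('work-experience-example', { title: params.title });
  }
});
```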

Powering Resume Assistant with LinkedIn data

Through user studies, we discovered that job seekers have trouble finding the right words to describe their experience on their resumes. To help prompt users in Resume Assistant, we suggest deidentified work experience descriptions derived from public LinkedIn profiles. The data is only obtained from member profiles where the member has chosen to keep their profile visibility public and where the position description is also publicly visible. Additionally, members can completely opt out of providing this information by switching the “Microsoft Word” setting to off on the settings page. To power this feature, we needed to create a new LinkedIn backend that would use all of our public profile information to choose good descriptions of what to write on a resume, given a job title. To implement this, we considered two options.

First, we could create a new search stack to power a relevance service for this feature. This option provided the best long-term flexibility, because we could easily perform all sorts of queries. LinkedIn has experience with search; for example, we already have people search and job search. However, we could not easily leverage these existing search services because they have different goals. For example, people search favors full profiles relevant to a query, but for Resume Assistant, we would prefer profiles with at least one good position description relevant to the input title. A new search stack would have involved a lot of new infrastructure pieces, and likely more SRE support to maintain the service.

The approach we decided to take instead is much easier to implement initially, but provides less future flexibility. Instead of a search stack, we created a new key-value store to hold a mapping from titles to lists of work experience descriptions. We chose to use Venice, a LinkedIn-developed database similar to Voldemort Read-Only. We populated the Venice store with an offline Hadoop script that uses public profile data to derive a title for each work experience description, and we then ranked the positions into an ordered list. With this data in place, requests for work experience examples can be served with a simple key-value lookup.
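Conceptually, serving a query then reduces to a lookup like the sketch below, with plain JavaScript and made-up data standing in for the offline Hadoop job and the actual Venice client.

```javascript
// Conceptual sketch: a standardized job title maps to a ranked list of
// deidentified work experience descriptions. The data and lookup are
// illustrative; the real store is populated offline and served by Venice.
const exampleStore = new Map([
  ['software engineer', [
    'Designed and scaled distributed backend services handling millions of requests per day.',
    'Led the migration of a legacy monolith to a microservice architecture.',
  ]],
]);

// Serving a request is a single key-value lookup plus truncation.
function getWorkExperienceExamples(title, limit = 10) {
  const examples = exampleStore.get(title.toLowerCase()) || [];
  return examples.slice(0, limit);
}

console.log(getWorkExperienceExamples('Software Engineer', 1));
```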

Accessibility

Providing an accessible and inclusive experience to all users is a fundamental belief of both Microsoft and LinkedIn, and we took this belief to heart during the development of Resume Assistant. We worked closely with the LinkedIn and Microsoft accessibility teams to identify, triage, and ultimately fix issues raised during the UI development cycle of the project.

We stumbled upon a unique set of accessibility problems during development because of the number of platforms the pane renders on. Because the application renders in a webview inside native Word on both macOS and Windows, the most difficult issues we encountered involved screen readers inconsistently treating our app as either a native app or a web document. With assistance from both the SDX and accessibility teams (and a lengthy investigation), we were able to mitigate the inconsistency and have the pane behave the way a screen reader user would expect.
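As a general illustration of this kind of mitigation (and not the specific fix we shipped), making the pane’s semantics explicit with ARIA roles and labels gives assistive technologies less room to guess; the element id below is hypothetical.

```javascript
// Illustrative only: explicit ARIA roles and labels help screen readers interpret
// a web app hosted inside a native webview more consistently. The element id and
// attribute choices here are hypothetical, not the actual Resume Assistant fix.
const pane = document.getElementById('resume-assistant-pane');
pane.setAttribute('role', 'region');
pane.setAttribute('aria-label', 'Resume Assistant');
```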


Overcoming the learning curve

As one of the first major integrations post-acquisition, Resume Assistant came with a bit of a learning curve when it came to blending LinkedIn’s design patterns with those of Word. The team decided to leverage the pre-existing fonts, colors, and pane dynamics native to Word, while infusing LinkedIn’s iconography, typographical hierarchy, and layouts into the experience. The next hurdle the design team had to overcome was figuring out how to best collaborate from a design standpoint. Because the design tools used at LinkedIn and Microsoft were incompatible, it was decided early on that LinkedIn would own the design of Resume Assistant. The Word team was then able to play more of a consultant role, providing feedback and guidance on designs and strategy.

Leveraging user research

Another learning curve the team overcame was that of user research. Unlike other LinkedIn products, Resume Assistant is a Microsoft Word feature (powered by LinkedIn), so we had to make sure that the product was appealing and valuable to both LinkedIn and non-LinkedIn members. LinkedIn researchers ran qualitative (in-depth interviews with a small number of participants) and quantitative (survey-style questionnaire sent to a large number of participants) studies with both groups to ensure that our product and designs would meet their needs. On the Microsoft side, we were able to leverage their weekly flash-feedback sessions, where Microsoft users are brought in to attend 15-20 minute interviews with multiple teams testing various products. The flash-feedback sessions allowed us to gather quick and lightweight insights on a very regular basis. Overall, it was an incredible collaborative effort between our companies to get the Resume Assistant in front of users throughout the various stages of design. The feedback we received led to valuable insights that helped to shape the product into what it is today.

Next

Though the infrastructures, tools, and best practices at LinkedIn and Microsoft differ, we worked together to leverage the best assets of each company, creating a product that helps the world’s professionals advance their careers and land their dream jobs.

Acknowledgements

The Resume Assistant would not be possible without the great people behind it. Thanks to Bradley Walker, George Pearman, Nadine Rao, Pedro Fernando Márquez Soto, Brian Cox, Kunal Cholera, Hang Zhang, Deirdre Hogan, Haoran Wang, Amy Tremper, Mukul K., Jason Speck, Kylan Nieh, Victoria Novikova, Patrick Corrigan, Slava Dubodelov, Joseph Florencio, William West, Sunil Mahadeshwar, Yuji Kosuga, Maria del Mar Ginés Marín, Alfredo Arnaiz, Nick Cipollone, Kevin Powell, Miki Suzuki, Tev’n Powers, Michael Daniels, Katie Sullivan, Ali Taleghani, Leah Brown, Kevin Black, Sean Oldridge, Simon Villeneuve, Dariusz Adamczyk, TJ Howard, Laura Licari, Arun Lakshmanan, Brent Lang, Javier Delgado, Swathi Jayavel, Scott Hysom, Sudheer Maremanda, Humberto Lezama Guadarrama, Alexandru Croicu, Sergey Shepshelevich, Alin Flaidar, Barry McHugh, Apurv Suman, Shikha Desai and a lot more for your great work!