Creating Video Sharing on LinkedIn
How the Engineering Team Perfected Cross-Office, Cross-Time Zone Collaboration
August 23, 2017
When we set out to build the new native video experience, we knew several things needed to happen to ensure a successful launch. First, video sharing needed to be seamlessly integrated into our product experience for members. Millions of members post on LinkedIn every week, and we wanted posting video to fit naturally within their existing site experience, so we needed systems that could scale easily as members adopted the feature. Second, we knew that the relevance and quality of content would be paramount. Finally, we wanted to start learning from our members’ feedback and iterate quickly, so speed of execution was going to be critical.
Building a successful video sharing experience as quickly as possible, within an ecosystem as complex as LinkedIn’s, required a few key ingredients: effective planning, team alignment, and open communication. Pulling this off meant aligning a large core product engineering team, large-scale systems, and the partner teams that would support our feature in the broader ecosystem. In all, the effort involved nearly two dozen teams based in four different office locations. In this post, I want to share some of the lessons we learned about executing such a complex project.
The technical expertise to build our video platform was split across a number of teams in different locations: video product engineering in our New York office, content infrastructure in San Francisco, video processing and storage in Sunnyvale, and spam/low-quality content detection and moderation tooling in our Bangalore office. This meant a core product engineering team split across the U.S., supported by infrastructure teams and more than 20 partner teams (everything from feed relevance to notifications to CDN) on two continents.
We wanted to allow these teams to have the autonomy to move quickly on their own, but at the same time needed to understand how—and when—it would all come together. To do so, we began with architecture planning, bringing together engineering leads from all the offices to develop a shared understanding of the overall architecture, data modeling, high-level integration points, and systems communication.
Once the engineering leads came to a shared understanding of how the individual systems would need to interact with one another, the teams developed, prioritized, and scoped their own roadmap for the features and systems for which they were responsible. This first step in planning allowed smaller groups of the right stakeholders to come together as needed to work out some of the more complex integration points. For example, developing multi-part upload functionality to make uploads faster and more resilient required close collaboration between the mobile client teams developing the product experience, the mobile infrastructure team responsible for networking libraries, and the video infrastructure team building the upload and processing pipeline. A thorough understanding of the entire end-to-end flow allowed these teams to come together to design a solution and then quickly implement their various pieces in parallel to pull together a seamless experience for our members.
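To make the multi-part upload idea concrete, here is a minimal sketch of how such a client might work: split the file into numbered parts, send them in parallel, and retry only the parts that fail. The names here (`multipart_upload`, `send_part`, the chunk size) are hypothetical illustrations, not LinkedIn’s actual API.

```python
import concurrent.futures
import time

CHUNK_SIZE = 4  # bytes per part for this sketch; real clients use several MB

def split_into_parts(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Split the payload into numbered parts so each can be sent independently."""
    return [(i, data[off:off + chunk_size])
            for i, off in enumerate(range(0, len(data), chunk_size))]

def upload_with_retries(send_part, part_number, payload, max_attempts=3):
    """Send one part, retrying on failure; only the failed part is resent."""
    for attempt in range(1, max_attempts + 1):
        try:
            return send_part(part_number, payload)
        except IOError:
            if attempt == max_attempts:
                raise
            time.sleep(0.01 * attempt)  # simple backoff between attempts

def multipart_upload(data: bytes, send_part, parallelism: int = 4):
    """Upload all parts in parallel and return their receipts in part order."""
    parts = split_into_parts(data)
    with concurrent.futures.ThreadPoolExecutor(max_workers=parallelism) as pool:
        futures = {pool.submit(upload_with_retries, send_part, n, p): n
                   for n, p in parts}
        results = {}
        for fut in concurrent.futures.as_completed(futures):
            results[futures[fut]] = fut.result()
    return [results[n] for n in sorted(results)]
```

Because only a failed part is resent, a dropped connection costs one chunk rather than the whole video, which is what makes this scheme both faster and more resilient than a single monolithic upload.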
With each team having a clear understanding of its charter and timeline, we could identify each team’s critical path and establish integration and product milestones with high confidence.
While the planning stages above may seem like obvious steps for any large project, that doesn’t mean they’re easy to achieve. The biggest challenges in aligning the teams’ roadmaps, for example, were managing dependencies and communicating status between teams. Further complicating this, some teams had a single integration point, whereas others had many along the way. Getting the right level of organization and communication was key, because this project required collaboration on a scale we don’t typically see when launching a product. For the most part, engineering teams are used to working with the folks sitting next to them. When they need to work with people in other offices, it’s usually for a single integration or a one-off task. For this project, everyone needed to adjust their style to communicate regularly with people in different offices and, often, different time zones.
To begin, we knew that everyone involved needed to start with an understanding of why video was so important to LinkedIn and our members. We began the project with a kickoff meeting where every team involved shared the vision for video on LinkedIn and walked through an overview of the product scope and key design components. Moving forward, we established a weekly scrum-of-scrums, in which a representative from each of the twenty teams reported on progress, flagged risks, and coordinated on removing blockers. We used a shared spreadsheet to track each team’s high-level progress so that everyone could easily see where their dependencies stood. Each week, we then sent a project status email to key executive stakeholders, as well as the entire team, highlighting progress and flagging risks where we needed help.
As the core product engineering team building the member-facing product experience, we met for bi-weekly town hall meetings to discuss priorities as a team. We used this time to share findings from user research, discuss new designs, and provide updates on the go-to-market, with the goal of keeping the member problems we were solving top of mind. Additionally, the tech leads from sub-teams within the core product engineering team (e.g., web player, iOS creation, tracking, etc.) would connect weekly to share status and call out risks, which could then be bubbled to the larger group.
On a day-to-day basis, we used a number of tools, including messaging clients for real-time communication across teams, frequent video conferences when folks needed to get in a room to hash something out, documentation available on internal wikis, and lots of JIRA dashboards to track progress of more granular tasks.
Trust and open communication
As with any complex project, bumps in the road came up along the way—architecture changes, disagreements on implementation, unforeseen issues that affected timelines, etc. Ultimately, trust and open communication allowed the team to keep the project on track despite these challenges. Effective working relationships are the key to any team’s success, but building them on a project with 50+ people across multiple geographies can be difficult. One thing we found is that, when it comes to building real relationships among team members, nothing replaces face time. Team leads frequently traveled from office to office, spending a week at a time with the various teams to get to know the individuals they were working with, and even at times working side-by-side and pair programming.
As a project leadership team, we met for 30 minutes three times a week with no fixed agenda other than open, candid discussion about how the project was progressing. This helped build relationships among the leads and the trust necessary to question assumptions and call out risks without fear of judgment or blame.
A team effort
Bringing video to LinkedIn was a massive effort. It also has been the smoothest product launch of my career. While planning, alignment, and communication played critical roles, it was really the passion and dedication of the team that brought it from concept to reality. But we’ve only just crossed the starting line. As the team continues to think about improving the video experience for our members, the foundation and relationships they built during product development will be instrumental to the continued success of video on LinkedIn.
Special thanks to New York and San Francisco product engineering, Vector, Feed, Publishing Infrastructure, Editorial Voice, Sitespeed, CDN, Spam Relevance, Security, Mobile Infrastructure, and all other supporting teams that made video sharing a reality.