Industry: Technology
Role: Product Design
Year: 2017
BACKGROUND
IBM Developer Advocacy is an online resource where developers find developer tools and services. We were asked to redesign and optimize the site so users could find those tools more quickly and efficiently.
My Role & Team
I was the Product Designer, conducting user research, interaction design, and visual design. A product manager helped define design requirements, the project roadmap, design reviews, and dev check-ins. The team also included two developers at Idean who helped build the product.
Content Audit
We conducted a content audit to understand how content was performing. Using Google Analytics, we measured page views, unique page views, average time spent, bounce rates, and other metrics. We chose a fixed period, July through November 2015, to understand how content performed over the long term.
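An audit like this boils down to ranking pages on the metrics pulled from Analytics. The sketch below is illustrative only: the page paths, counts, and the choice of average time spent as the quality signal are assumptions, not the project's actual data.

```python
# Hypothetical rows resembling a Google Analytics export: page path,
# page views, unique page views, average time on page (seconds), and
# bounce rate. All values here are made up for illustration.
rows = [
    {"page": "/learn/intro",  "views": 1200, "unique": 950, "avg_time": 210, "bounce": 0.35},
    {"page": "/learn/guides", "views": 800,  "unique": 640, "avg_time": 180, "bounce": 0.40},
    {"page": "/search",       "views": 600,  "unique": 500, "avg_time": 45,  "bounce": 0.55},
]

def rank_by_engagement(rows):
    """Rank pages by average time spent, treating time as the quality signal."""
    return sorted(rows, key=lambda r: r["avg_time"], reverse=True)

ranked = rank_by_engagement(rows)
```

Sorting on time spent rather than raw views is what surfaced the Learning Center as the highest-quality traffic driver in our audit.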
Insight-Driven Designs
We learned that the Learning Center drove the most traffic, and the highest-quality traffic in terms of time spent, so we focused on how to make those pages easiest to reach. Search ranked second: users felt comfortable with it and typically searched for specific developer tools.
Wireframes
We focused on funneling traffic to popular areas like the Learning Center and placing the Search function at the top. We supported business goals by displaying featured content and resources, dynamically sorted by popularity.
Responsive Design
Our designs needed to be responsive. We used the Bootstrap framework with its typical breakpoints: 768px for small devices, 992px for medium devices, and 1200px for large devices.
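The three breakpoints partition viewport widths into four tiers. A minimal sketch of that mapping, assuming Bootstrap 3's standard cutoffs (768px, 992px, 1200px) and hypothetical tier names:

```python
def breakpoint_tier(viewport_width: int) -> str:
    """Map a viewport width in pixels to a layout tier.

    Cutoffs follow Bootstrap 3's standard grid breakpoints; the tier
    names are illustrative, not Bootstrap class names.
    """
    if viewport_width >= 1200:
        return "large"
    if viewport_width >= 992:
        return "medium"
    if viewport_width >= 768:
        return "small"
    return "extra-small"
```

In practice these tiers are applied through CSS media queries; the function just makes the boundaries explicit, e.g. a 1024px-wide tablet landscape view falls in the medium tier.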
Visual Design Explorations
We had to follow IBM's Brand Guidelines, with a little freedom in creative direction. We communicated closely with their team to guide color selection; I drew inspiration from a primary deep blue paired with bright secondary colors.
Responsive Visual Designs
Outcome
After the design was implemented, our team wanted to know how it affected user behavior. We sampled the same four-month time frame, with no events or campaigns that could skew the data.
Analytics showed a 74% increase in sessions from the home page, from 2.7K to 4.7K. We also saw a 172% increase in first interactions, from 477 to 1.3K sessions. We did notice a slight decrease in drop-offs after the second interaction; our team was unsure why, and with more time and scope we would try to improve the second, third, and fourth interactions.
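The reported percentages follow directly from the session counts. A quick check of the arithmetic (taking the rounded 1.3K figure at face value, and truncating to whole percent):

```python
def pct_increase(before: float, after: float) -> int:
    """Percent increase from before to after, truncated to a whole percent."""
    return int((after - before) / before * 100)

home_sessions = pct_increase(2700, 4700)     # home page: 2.7K -> 4.7K sessions
first_interactions = pct_increase(477, 1300) # first interaction: 477 -> 1.3K
```

This reproduces the 74% and 172% figures cited above.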