The Sam Haveson Hypothesis: The Artfulness of Product Management is Identifying Customer Needs
In this episode of the Product Science Podcast, we cover Sam’s career in product, testing and experimentation at different scales from startup to enterprise, and how to do real time experiments to measure progress.
Holly Hester-Reilly
Sam Haveson is a Senior Product Lead on the Consumer Product team at Twitter. Sam has defined, launched, and scaled products that help millions of people create and converse on Twitter. Prior to Twitter, Sam was a Senior Product Manager at Amazon building Amazon Photos experiences for Amazon Alexa devices. She holds an MBA from Cornell Tech, where she is an adviser to the program's graduate students. Sam is also a writer and musician based in San Francisco, CA.
Subscribe for the full episode on Apple, Google Play, Spotify, Stitcher, and more. Love what you hear? Leave us a review; it means a lot.
Resources
Questions We Explore in This Episode
How did Sam start her journey into the world of product? Why is product a good career option for people who want to be involved in multiple disciplines? How is product at the intersection of people, technology, and creativity? How can working for startups help expose you to the different skills a product manager needs to have? Why did Sam pursue an MBA at Cornell Tech? How did Sam get started at Amazon? How was Sam able to cultivate relationships to further her career? What attracted Sam to a company like Amazon? Why did Sam move on to work at Twitter?
When working for a startup, how much access can you have to the rest of a business’s operations? Why is being a product manager more of a hands-on role? What is it like being part of an internationally based startup? How does product culture differ between different regions? What does research and experimentation look like for companies at different sizes and scales? What is experimentation like at a startup? What does research and experimentation look like in a bootstrapped environment? What is the same about user research across businesses of all sizes?
How do Amazon & Twitter measure customer satisfaction in real time? How does Amazon measure customer satisfaction and how does it help measure progress? How can customer satisfaction help you measure progress at scale? How does a customer satisfaction score reveal different information than traditional surveys do? How do you test progress and whether your products are improving? How does Twitter use behavioral analysis in their user research? Why does Twitter’s access to information help create internal improvements? How does Twitter measure cause and effect through applying behavioral analysis to large-scale real-time data? How did Twitter embrace user workarounds as an indicator for how to improve a feature? How does Twitter find product opportunities when analyzing trends?
How did Twitter strategically innovate from 140 characters to including images and GIFs? How did Twitter balance adding new features while not compromising on their core features? How does Twitter find users to test experiments with, and how do they scale their experiments? Why is it important for product teams to experience issues with a product as common users? How does building for a small sample of users help promote benefits for users at scale? How does Twitter handle scalable A/B tests for a wide range of metrics? What are some methods for knowing what sample size to test for? How can product managers stay hands-on with users at scale? How can process be used to keep the whole product team close to the customer? How does Twitter make A/B testing available to all product teams? How do Twitter’s product teams validate their experiments? How can teams test for causal behavioral changes over a wide range of experiments?
How are product teams structured at Twitter? How are all members of the product team at Twitter equally owners of a product? How does Twitter’s use of remote work help empower its employees? How can remote work allow for better, more productive teams? How does having diversity in the forms of information help make it more accessible to all teams? What are different learning styles and how do they affect how we communicate? How does a company’s size affect your impact and ownership of a product? How can you realize what skills you are best at in product? How do you create a career path to develop new skills in product?
Quotes From This Episode
To build something that a small group of people will love will get you to a better place so that a larger group of people can eventually like it and then maybe love it.
Authenticity is a really important ingredient in the recipe of doing great work.
The artfulness of doing product management is being able to key in on what the customer need is that we're observing.
Transcription
Holly:
This week on the Product Science Podcast, I'm excited to share a conversation with Sam Haveson. Sam is a Senior Product Lead on the Consumer Product team at Twitter. Sam has defined, launched, and scaled products that helped millions of people create and converse on Twitter. Prior to Twitter, Sam was a senior product manager at Amazon, building Amazon Photos experiences for Amazon Alexa devices. She holds an MBA from Cornell Tech where she is an advisor to the program's graduate students. Sam is also a writer and musician based in San Francisco, California. Welcome, Sam.
Sam Haveson:
Hi Holly. It's great to be here. Thanks for having me.
Holly:
Yeah, thank you for being on the podcast. So I always like to start by hearing a bit about people's journeys into products. So tell me a bit about how you entered the product world.
Sam Haveson:
Yeah, sure. So I would love to open up just sharing a little bit about myself. I grew up in New Jersey. I was raised in a family that allowed me to explore many different disciplines. I did everything from arts, sports, music, and I always found that I was really a generalist in being able to explore different interests. The first time I was really enthralled by technology was when I was growing up spending hours in Microsoft Paint, making graphic design posters, and sort of being inspired by that lens. I then went on to teach myself how to code and was doing a lot of web development for my own projects and eventually learning how to make those skills useful for helping others. Something that became very clear to me was that this intersection of people, technology, and creativity would become themes throughout the rest of my life.
When I went on to college, I was excited to be able to be exposed to different opportunities. While in college, I got to capitalize on my interests by working for a few startups. One was a social networking app based in New York. Another was a mobile video messaging app based in Israel where I actually got the chance to live and work for a summer, getting a front row seat to the Start-up Nation. And at these startups, I was able to wear many hats, diving across strategy, product development, brand marketing, management, and really harnessing more of my technical skills. And ultimately, it gave me the confidence to look at myself and see what entrepreneur I really had inside. And that led me to opening up my own web design company, Social Focus. I had this entrepreneurial bone in my body and I was excited to explore the journey of opening an LLC, picking up clients, helping them get their own web presences online.
And through working for these startups and then having my own entrepreneurial experience myself, I was exposed to a ton of interesting opportunities and people. One summer, I got accepted to a leadership program at Google and I was able to go to Silicon Valley, see the scale of products and talent there. It made me realize that I really wanted to develop a career within the walls of a bigger company, like a Google.
After doing some research, I came to realize that in order to level up, I might want to consider business school. I wasn't interested in any traditional business school programs. I was really looking for sort of a niche program that could offer the skills that I needed. And I was fortunate to come across a program being offered by Cornell Tech. It was this amazing confluence between business, technology, product management, and entrepreneurship. The program was built in New York City. And in 2017, they took a chance on me, appreciating my background. And so I joined that third inaugural MBA class.
Cornell Tech is a super special place. Throughout their product curriculum, they have this one aspect called Product Studio. And that's where they enlist startups and Fortune 500 companies to submit challenges to work with students. So I had the opportunity there while I was an MBA to provide consultancy to a company that I deeply admired, which was Amazon. I was partnering with an Amazon director and a group of students on a challenge, specifically focused on AR and VR and how to illuminate the Amazon Photos experience using some of that technology. That consultancy was an amazing experience. It really was an externship for me. And through working with that director during my MBA, we were able to cultivate a relationship. And that led me through the process of actually considering an opportunity at Amazon.
I'm really grateful to have had the opportunity to get hired at Amazon. That was my first product experience. I was hired in as a senior product manager after my MBA. I joined that team, the team that I was providing consultancy for, which was the Amazon Photos team. And I worked at the intersection of Alexa-enabled devices. So everything that sort of powers a screen and taking the best of your photos catalog and bringing to life some of those experiences using voice. It was a great experience learning and growing at Amazon and I can go into those details. But the company is very large, right? And so having been at Amazon, I realized that I would be really interested in kind of going to a smaller company, but something that still provided me with the scale and impact that I was having within my work.
So I was very fortunate to be presented with an opportunity to interview at Twitter. Twitter was a company at the time that was a lot smaller than Amazon, but had enormous impact in terms of the customers it served and the sort of global footprint. So I had been using the Twitter product for a decade at the time that I was interviewing so I knew it like the back of my hand and I was excited to kind of think about making impact on the service. So that led me into my next product role, which was at Twitter. I was hired there on their consumer product team. I've had just a wealth of experience there almost coming up on four years, where today I'm a senior product lead on the consumer product team, on our creation and conversations team. And so that has been my journey into product management. It's been a ride.
Holly:
Yeah, what a journey. Let's go back earlier on in the journey. I'd love to hear a little more about some of the startups that you got involved in when you were in college. What was the sort of scene like for those?
Sam Haveson:
Yeah, I think at the time what was really exciting for me was figuring out how to just be a sponge and learn. And the startup opportunities really allowed me to work with super talented people, be exposed to founders, be exposed to the realm of raising money and capital and going through the process of growing a startup, the ups and downs of kind of like having a user base and seeing the growth of that user base and the competition, especially working on mobile apps. So I was really excited to have two different startup experiences. One within a social networking app, it's actually called Timehop. They're based in New York City. Timehop is essentially a product that allows you to see your photos from your past. You can imagine that at the time they were one of the first to kind of bring this social nostalgia to light, which we all know and love within other social products today that maybe show you pictures from today and years past. But it was really exciting to help that team and work with them.
And then when I had the opportunity to work abroad and get a front row seat to the Start-up Nation in Israel, I was excited to work for Glide, which was a video messaging startup. And that video messaging startup was excellent in sort of seeing how video technologies could scale. There was a lot of real time video improvements and technology breakthroughs that were happening within that product. It was exciting for me to get access to working in sort of like an international office and getting access to sort of like growing a product that was having a footprint on so many people's lives across the world. Those startup experiences were really awesome for me. And like I said, it was all about being a sponge, like wearing many hats and getting close to people who were living and breathing and thinking the sort of next wave ideas.
Holly:
Mm-hmm. So what was the role of research and experimentation in the startups back then?
Sam Haveson:
Yeah. Research and experimentation within those environments I think was a lot more scrappy because you had to try to do what you could with bootstrapping resources. But what I really appreciated was the immersion with customers, getting close to customers, doing routine focus groups, user testing, being able to sort of understand how people were using the product in their daily lives, what some of the use cases were and how could we zero in on the use cases that we hadn't been paying attention to. And so, those things were really key for me. At the time, I was really excited to get closer to young customers, people who were developing their identities and using some of these types of apps to stay connected to family, friends, and their communities. And so I really dove into understanding the user needs and the customer needs of some of those segments and how we could build better products for them, whether they would be social oriented or connection and messaging oriented. And that became something that was really illuminating for me.
Holly:
Yeah. Cool. I like the scrappy startup stage of user research. It's definitely something that's been a big part of my career. And I just like to hear about how it happens at other places.
Sam Haveson:
Totally. Yeah. Doing user research in those environments was very different from doing user research and experimentation in larger-scale environments like Amazon or Twitter. The nuances and differences there are certainly key, but each has their own element of value. I think at the end of the day, it's really about getting closest to the customer. Being a product leader is being the voice of the customer to your team, to the company, to the business, being able to represent that customer. And so the environment or the lab, if you will, might change depending on the scale or size of the company, but the intention to get closest to the customer and their needs has remained true throughout my career.
Holly:
Mm-hmm. Cool. So you mentioned that that environment is different at Amazon and Twitter. So tell us a little bit about that. What does experimentation look like at Amazon?
Sam Haveson:
Sure. So experimentation at Amazon, it was really, really interesting. I think broadly there's an enormous respect for the talent on the user research, data analytics, and business intelligence teams at Amazon. At any point in time, products are being tested. Routinely, they're being dogfooded. They're going through various stages of betas. And one thing I observed actually about doing product management at Amazon and the user research and data analysis portion was that they relied really heavily on what is called a CSAT score, a customer satisfaction score. And at the time, we were building products for Alexa-enabled devices, specifically bringing photo experiences to life on Alexa-enabled devices. So you can imagine if you uploaded your photos, Holly, being able to ask Alexa, "Alexa, show me photos of Holly in Paris." And we were kind of lighting up these voice experiences and understanding what customers would want and what they'd want to ask Alexa to see.
And so at the time, customer satisfaction scores or the CSAT score was a really great indicator for us to know how we were making progress. We went through a lot of routine betas, and those betas are obviously opt in. So a customer can have sort of like a revealed preference that comes through the usage of a beta, which might be different from a stated preference that might come through the form of a survey. So beta testing is really great for you to understand revealed preferences. And what we were looking for at the time at the end of all of that sort of routine beta testing was for them to fill out forms that told us ultimately what was their customer satisfaction.
And those CSAT scores week over week, we were looking to just marginally improve. How can we get those CSAT scores to a better place? So at the time, I think doing user testing and experimentation at Amazon, I learned a lot about the value of a customer satisfaction score. In other companies they might use things like net promoter scores, which are very similar in sort of assessing the satisfaction of using an experience. Behavioral analysis has actually been much more illuminated in my current role at Twitter. And I'm happy to talk about that in more depth.
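[A rough sketch of the CSAT mechanic Sam describes: count the share of beta survey respondents who rate the experience as satisfying, and track it week over week. The 1-5 scale, the "4 or above counts as satisfied" rule, and the sample data below are illustrative assumptions, not Amazon's actual methodology.]

```python
from collections import defaultdict

# Hypothetical beta survey responses: (ISO week, rating on a 1-5 scale).
# The scale and the "4 or above counts as satisfied" rule are common
# conventions, not a description of Amazon's actual survey.
responses = [
    ("2019-W14", 5), ("2019-W14", 4), ("2019-W14", 2),
    ("2019-W15", 5), ("2019-W15", 3), ("2019-W15", 4), ("2019-W15", 5),
]

def weekly_csat(responses, satisfied_threshold=4):
    """Return {week: percent of respondents rating at or above the threshold}."""
    by_week = defaultdict(list)
    for week, rating in responses:
        by_week[week].append(rating)
    return {
        week: 100.0 * sum(r >= satisfied_threshold for r in ratings) / len(ratings)
        for week, ratings in sorted(by_week.items())
    }

for week, score in weekly_csat(responses).items():
    print(f"{week}: CSAT {score:.1f}%")  # 2019-W14: CSAT 66.7%, 2019-W15: CSAT 75.0%
```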
Holly:
Yeah, sure. Tell us more about that.
Sam Haveson:
Sure. So at Twitter, we are really fortunate to have the ability to, I call it drinking from the fire hose. At any point in time, you can see feedback about the service real time within tweets, right? This is a huge customer benefit for people to feel that their voices can be heard. Whether they're sharing their thoughts and feelings about a product or a brand, tons of companies rely on looking at those tweets and that sentiment analysis. But imagine being on the inside, right? Imagine being on the product team and actually drinking from the fire hose that is tweets from your customers about the experiences you're building. So you have a really great and sort of wide swath of customer feedback that you can sift through directly on your platform and directly on your service. And so the user research teams at Twitter do a lot of that sort of analysis and understanding about sentiment. Whenever customers are tweeting about their experience on Twitter, new features we're bringing to market, new services we're launching, we're looking actively at how they're responding and reacting to it.
But in addition to that, we also have a great framework for doing qualitative understanding of how they're experiencing those new products and services. We bring customers in, we do qualitative studies with them where we're assessing their customer needs against these perceived benefits we want to bring to market. Those qualitative studies are really illuminating for the product teams to understand, "Okay, what is the customer feeling about this experience? What are their needs? How do we get to the heart of what they want from our service?" And then we can actually take that a step further through behavioral analysis in which our data science teams have been really great in allowing us to set up environments to understand causal effects of changes that we bring to market.
So it's not just enough for us to observe, let's say in a beta environment, how someone might be using something. That certainly is a tactic that we employ and it's been one that has brought me a lot of value and understanding qualitatively how is someone using something, what is their adoption, and what is their sort of behavioral changes through usage? A beta's great for that. But we also set up testing environments, specifically A/B testing experimentation at Twitter is really robust. And we apply a lot of rigor to how we set up those A/B tests. We have awesome frameworks in-house that allow us to understand causality. So like introducing this change to a set of customers, how do we causally drive different changes in their behavior?
Let's say, for example, we want to drive tweeting behavior. We would like people to create more, to share more, to have their voice be heard. How are these new products we're bringing to market causally affecting them when it comes to tweeting? And you can imagine this for every paradigm of consuming and engaging. But our experimentation frameworks allow us to do a lot of really great causal inference. And so I've been privileged, I would say, to have this sort of beta environment, qualitative environment, quantitative environment, to just get to the heart of sort of behavioral analysis when we're building products at Twitter.
Holly:
Yeah. That sounds really awesome. So I'm curious to hear more, like, do you have any specific stories that you can share about a product or feature that you were part of launching and what kind of analysis results you found?
Sam Haveson:
Absolutely. I think what's really sort of like an interesting story is an experience that we brought to market in 2019, which I was super excited to be able to drive product on. I was sort of our product lead at that time on this experience. What it really was, Holly, was taking a look at one of the most integral product experiences on Twitter, which is retweeting, right?
Holly:
Mm-hmm.
Sam Haveson:
And retweeting has been so embedded in the product since its inception. It's this idea of sort of rebroadcasting, showing affinity for. It's so well known and kind of true to the DNA of the product. But one thing we identified with retweeting was that there was a leveled up mechanic of retweeting, which was called quote tweeting. You could actually insert a commentary. You could share a thought in addition to when you shared something and you wanted to make sure that your audience understood how you felt about it or how you were reacting to it. For the longest time, the quote tweeting mechanic, which again built off of the retweet mechanic, was only limited to text. And at the inception of Twitter in its sort of earliest days, we understand why that was true: constraints breed enormous creativity, and sort of 140 characters was this constraint that you could use for text. So a lot of the service was really built around text based sharing.
But when I analyzed and understood a little bit more about customer needs on the service, it became very clear to me that people were looking to express themselves with more than just text. And specifically, we were seeing a lot of trends at the time taking off with GIF sharing and photo and video sharing for people to express themselves. And so the product opportunity that I really got to assess, and I worked with a lot of customers and I understood some of the sort of workarounds that they were doing in light of only being able to share a quote tweet with a comment that was text based or a retweet with a comment that was text based, so they actually wanted more.
Sometimes what they were actually seeking to express probably would've been better with something from their camera roll or a GIF that's a lot more reactionary. And so we were exploring what we call help wanted signs. Help wanted signs are ways that customers hack your product to achieve the output that they want on Twitter. Help wanted signs or hacks have actually bred a lot of great innovation in terms of customers being like, "Okay, this constraint exists. How can I push it to get the output that I want?" And we observed a lot of people in terms of sharing quote tweets with text that they would share the commentary and then they would say "Insert GIF here" or, "Add photo here." And while they couldn't do that at the time, we were able to assess that they had wanted that.
So the challenge was for us to think about what it would be like for us to expand on this limitation to give customers this ability to quote tweet with media, to express themselves in a much richer context. And so I was really excited to kind of go through that process of designing and building that experience. As you might imagine, the tweet, the primitive of a tweet, in the timeline has its own kind of placement in real estate. And we want people when they're going through the timeline to be able to move through the timeline pretty fast and we want them to get the information they need quickly. But as you can imagine, opening a tweet that's a lot larger or that has more tall, visible real estate because new media is being added to it, we had to really design that in a thoughtful way.
How can we do this and make the tweet still feel compact, but also give customers this opportunity to share the media that they care about? And so it was an awesome design challenge. We went through that, and I was working with a really great team. We tested it. Through testing, as I shared, the experimentation framework that we introduced, we were able to see causality. We were able to see that this really made a difference in people's tweeting behavior. And so we were excited to bring that to market. And now today you wouldn't actually tell the difference that you can just share a quote tweet with a GIF or a photo or a video. It just feels like rich expression that's always been there. But there was a time when you could only do it with text.
So that's what I like to sort of draw on in a way of kind of understanding how is my customer trying to achieve something? What workarounds are they employing? And then how do we productize those workarounds and give them more? I love using this example because it seems simplistic, but it had its own unique nuance to bring to market.
Holly:
Yeah. So one of the things that I'm curious about, because I talk to so many people who are in much smaller companies that don't have the analytical capabilities that a company like Twitter has, I'm just curious, what does it actually look like to be reading the results of the sentiment analysis? How do you find the help wanted signs? Do you go to a team and ask the team to analyze the data? What does that look like?
Sam Haveson:
Yeah. So I think with any wealth of data, if you have a large sample size, you always need to be mindful about noise. It's not always that a larger sample or a greater sample is going to give you sort of more evidence or more conviction. So I always sort of caveat that while it's incredible to have a wealth and bank of data and information, sometimes you have to be really clear about signal-to-noise ratios. Like where am I getting signal versus where am I getting noise? And that's really like the artfulness of doing product management and being able to key in on what is this customer need that we're observing or seeing or dissecting through research. The behavioral data, is that happening at scale? Are there trends? Are there other sort of cohorts exhibiting that behavior?
And then you start getting to sort of the heart of it, which is, is this something that we could illuminate and bring to life and improve on our surface areas and will it have enough impact to other cohorts? So it usually starts with trying to find a smaller sample or a smaller set of customers that behave similarly. And we would call that sort of like a cohort that has the same sort of behavior, whether that's usage behavior, demographic behavior, et cetera, and trying to figure out if we make their lives better. Could that scale to other cohorts, other similar cohorts? It's interesting within your question, which is like, when there's a lot of data and information, how do you sort of parse it? And again, I always repeat that it's important to figure out signal to noise and when you need to do that. And to get to the heart of it: try working with a smaller sample size, try working with a smaller cohort and see if that cohort mirrors behavior of other cohorts. Maybe a little bit more nuanced, but you could start there.
It's always important to look at a behavior that might be emergent amongst, let's say a cohort that might be using the platform for years. So we would call those sort of like heavy tweeters, right? These are people who have particular behavior sophistication with using the product and service. And their needs might be wildly different from someone who is sort of like a new tweeter or a light tweeter, someone that might be new to the service and/or maybe doesn't have those usage patterns. And so we always need to think about the cohorts and the frequency of usage and assess that against what we're building and who are we building for. And so that's something that has been really useful.
And just to get at one piece of your question, which was how do you find when people are hacking or doing workarounds, these things come through organically from research studies that we've conducted, trends that we're dissecting, but a lot of the time it also comes from the product team themselves having experienced that friction and having exhibited that workaround themselves. So it's really interesting being on a product team where people have been using Twitter for decades, people themselves that work on this product love this product. And they have found the friction that some of our customers have brought to us. And so that firsthand experience of understanding where does that friction come from, how do we reduce it, how do we make their lives better, how do we give them more from the service, can oftentimes also come from firsthand experience, from sort of dogfooding your product, using your product and interacting with the community directly.
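[To make the cohort-first approach Sam describes concrete, here is a rough sketch of bucketing users by usage before studying a behavior within each bucket. The thresholds and field names are illustrative assumptions, not Twitter's internal cohort definitions.]

```python
from collections import Counter

# Illustrative only: the thresholds and field names are assumptions,
# not Twitter's internal cohort definitions.
users = [
    {"id": 1, "days_since_signup": 12,  "tweets_per_week": 0.5},
    {"id": 2, "days_since_signup": 900, "tweets_per_week": 40.0},
    {"id": 3, "days_since_signup": 400, "tweets_per_week": 2.0},
    {"id": 4, "days_since_signup": 730, "tweets_per_week": 25.0},
]

def cohort(user):
    """Bucket a user into a coarse usage cohort before analyzing a behavior."""
    if user["days_since_signup"] < 30:
        return "new"
    if user["tweets_per_week"] >= 20:
        return "heavy"
    return "light"

print(Counter(cohort(u) for u in users))
# Counter({'heavy': 2, 'new': 1, 'light': 1}) -> then study the behavior
# (e.g. a workaround) within one cohort before generalizing to the others.
```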
Holly:
Yeah. Cool. I really love that. And I love what you were saying about the cohorts and the smaller groups. I think that is a consistent message across all company sizes, that we have to look for segments and groups that have things in common that separate them out from just looking at the mass all at once. Because when you do that, it's so noisy.
Sam Haveson:
Absolutely. I think one of the most important things to realize here is that to build something that a small group of people will love will get you to a better place so that a larger group of people can eventually like it and then maybe love it. But with niche audiences, you're really trying to figure out, did we get this product right and is this product really meeting the needs of this subset of people and are they loving it? It's sometimes counterintuitive to assume that everybody all at once, en masse, will appreciate something.
So starting with a tinier audience and a smaller niche cohort and getting to the sweet spot of them loving it can illuminate a lot more findings, such that you can get other cohorts that look like them or have a deviation from them. And you can sort of find those adoption patterns as you scale it. But if you start off the bat with, "We're building for everyone," you actually build for no one. So it's remarkably important to have a key customer in mind, have a key cohort in mind and have like a laboratory or an environment to understand, "We're making progress against this key cohort, we're making progress against their needs." Potentially, their needs may be exhibited at scale by others, but can we start small? Can we start there? And that's something that I have found in my career to be really useful.
Holly:
Yeah. Awesome. There is another thing that you mentioned that I'd love to learn more about, which is the robust A/B testing infrastructure that you have there. How do teams using that... What does it look like? How do they do it without stepping on each other's toes?
Sam Haveson:
Yeah. So I think a lot of what A/B testing is leveraged for or useful for is really just deciphering causality, right? At the end of the day, we're trying to have a population of people in what we would call sort of a control group and a population of people within a treatment group. That treatment group is going to get a set of changes that are different from the control group. What we're looking for is how this treatment group behaves relative to the control sample. So whether it's a step function change in the way that you're consuming, whether it's a step function change in the way you're creating, whether it's a step function change in the way you're browsing, whether it's a step function change in the way you're discovering, any of the nuances you're introducing, you want to hold back a portion of customers who are getting the experience that's in production. So the experience that is baseline.
What we like to do when we infer causality amongst the treatment group is we're really looking for a set of behavioral differences against that control group. So we can better understand, "Hey, is this step function change driving new behavior? Is it driving the outcomes we cared about?" And we're looking for the causality of that against the group of people who didn't get that change. And so when you set up a lab or environment like this where you're doing A/B testing, it's not as much about seeing sort of like, just click through behavior or if you're driving conversions. Those things can actually be assessed offline. These A/B frameworks that we use here are all about causality. What is being measured differently and exhibited differently in the treatment group relative to the control group? And how are these people reacting when they get a new change?
And so sort of thinking about data analysis and causality has allowed me to really understand, "Hey, is this thing we're introducing driving a statistical difference in their behavior? Maybe it's driving a difference in behavior, but is it statistical?" And looking at that, we have enormously talented data science teams who are evaluating the nature of that statistical change, the nature of the p-values. So we're using a lot of quantitative methods to get to the heart of, "Hey, did we causally drive the behavior we sought out to change?" I think that that framework paired with long term time horizon analysis, so once you've assumed that you've proven a hypothesis to be true or false, you can take that out of the laboratory environment, you can give it to a wider set of customers, but you can actually do something that we call holdbacks.
And so at Twitter, holdbacks are employed. I think a lot of tech companies use holdbacks, which basically means we're giving this to mostly everyone. We think this is an experience that's going to benefit the mass user base. We're going to hold back a certain percentage just to watch long term causality take place over a longer time horizon, whether that's six months or a year. And so while you might A/B test on the outset, maybe part of your launch strategy is continuing to hold back a portion of people so that you can continue causally inferring, how is this going to make an impact six months from now, a year from now on the user base and the service? So those are some of the tactics that we sort of employ.
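[Sam doesn't go into Twitter's internal statistics, but the core question she describes, whether the behavioral difference between treatment and control is statistically significant, can be illustrated with a standard two-proportion z-test. The numbers below are made up; this is a minimal sketch, not Twitter's experimentation framework.]

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up numbers: 10,000 users per bucket; 12% of control tweeted during the
# measurement window vs. 13% of treatment after getting the new feature.
z, p = two_proportion_z_test(1200, 10_000, 1300, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real behavioral change
```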
Holly:
And what kind of percentages end up being put in the hold back group?
Sam Haveson:
Oh, it changes all the time. That can vary. But what you're ultimately looking for with a holdback is that a majority of customers get the experience. So 95%, 98%. So you're really looking to hold back probably less than 5% at any time, because as you can imagine, network effects are really real. And network effects on a product like Twitter mean people being able to create and consume and get the benefits of an experience. You want to make sure that you're not withholding people from participating entirely within those network effects. So you want to make sure that the percentage of people that maybe are withheld are smaller and that substantially there isn't like a huge deviation in their experience, that they can still get the value. In any case, we usually release consumption experiences to everyone. And maybe we just hold back a percentage of people from creating or being able to sort of utilize a tool or a capability, in the context of the teams that I work on.
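[The mechanics of keeping a small, stable holdback aren't covered here, but one common way to do it is to hash each user ID into a bucket so the same users stay held back for the whole measurement window. This is an assumption about the general technique, not Twitter's implementation; the function and parameter names are hypothetical.]

```python
import hashlib

def in_holdback(user_id: str, experiment: str, holdback_pct: float = 2.0) -> bool:
    """Deterministically assign roughly holdback_pct% of users to the holdback group.

    Hashing (experiment, user_id) keeps the assignment stable across sessions
    and independent across experiments. Illustrative only.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10_000        # 10,000 evenly distributed buckets
    return bucket < int(holdback_pct * 100)  # 2.0% -> buckets 0..199

# Example: ship a new creation feature to everyone except a ~2% holdback,
# then compare the two groups again six months later.
for uid in ["1001", "1002", "1003"]:
    group = "holdback" if in_holdback(uid, "new_creation_feature") else "launched"
    print(uid, group)
```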
Holly:
Awesome. We've talked so much about research and experimentation, but I'm curious to hear who are the main people on the teams there that are involved in that? Obviously, product is involved in it, but what is the role of design and UX research and what are the different roles that are involved there?
Sam Haveson:
Absolutely. So Twitter is a world class team. I'm really fortunate in my career to have worked with some of the best product designers, user researchers, data scientists, software engineers, engineering managers. All across the gamut, these cross-functional peers have been really instrumental in the product team. But we also have cross-functional leadership with regards to product trust, legal counsel, product marketing, brand development. So these teams have really, really incredible folks and talent that are working on them. To answer your question about who is involved with the product development process, it's a really sort of like flat structure in the sense that everybody has a hand in the process. Everyone feels like an owner in the process. At least on the product teams that I have worked on, I have tried to create environments where people do feel like owners and everybody has skin in the game in the outcomes of the scenarios with their customers. So everybody's on the ground learning and developing.
I think one thing that has been really nice sort of over the last year is the nature of being able to build in public and to interact directly with the communities and experiences that we're putting out there. There's been a lot of really interesting and new research methodologies we've employed to get closer to the communities that we serve. So one example of that is one of the offerings that was released called audio Spaces. This is a product experience where you can host a real time audio conversation. You can do this with up to 10 speakers, two co-hosts, and an unlimited number of guests. And the benefit of this Spaces environment is that you can really host a discussion. Our product team has been very fortunate over the last few months, as we've been experimenting with features, to host Spaces directly with our community and having folks from the product team speak, discuss, and learn directly from our customers.
So within those spaces, the user researchers may help facilitate it. Designers are there, engineers are there. Myself as a product manager is there. And so we're all getting closer to the customer and asking them questions directly. And we're doing this actually using a tool on our service. So again, drinking from the fire hose that is launching products to a user base that can be vocal and have a touchpoint with you at any moment is really a privilege. So I'm really excited that everybody takes a grand step forward. When we are doing these types of learning exercises, when we are getting closer to the customer, it feels very flat in the sense that the cross functional teams are participating.
Holly:
Great. So it sounds like you've really got more than the product trio. You've got product design and engineering, but you've also got UX research, you might have data analysis and other roles as well that are all listening and learning along with you.
Sam Haveson:
Absolutely. I think great product development is really about empathy and being empathetic to the customer, being empathetic to the experiences that they're having. You can only really build great products when you are empathetic to the people you're building those products for. And so what we try to do is make sure that everybody is being given the opportunity to have empathy. And so the closer you are to the customer, the closer you are to the research, the closer you are to understanding needs, the more empowered you're going to be when you're building. So we think everybody should be a part of that process. I think it's really important. And it's fluid, right? Because at any point in time, whether it's sort of in the earliest stages of product development, through to public experimentation, through to the launch, and then the iteration cycles, you need everybody to be plugged in at all of those stages. You don't want to plug people in at, let's say part three of that stage. It's really important to start from the onset. So from that whole journey, I really appreciate making sure that everybody on the team is involved.
Holly:
So it sounds like the team must spend a lot of time together?
Sam Haveson:
The team does spend a lot of time together. We're very fortunate at Twitter to have a very kind of decentralized workforce in the sense that people are working from all over the world, all over the country. We're remote in the sense that people can do the best work of their lives remotely if they wish to. They also have a choice to be hybrid, to go into the office sometimes or to work remotely sometimes. And so there's a lot of flexibility that has led teams to change how they work.
But one thing that has remained really sort of singular in the last few years and the trends that have changed with how we do our work is that people stay really close to the product teams that they're working on. There's a lot of interaction and opportunities for people to feel close together, and we have a lot of tools that facilitate that communication, whether it's Google Drive to carve out documents and presentations, or Slack to communicate daily, to the meetings that we host to build cadence and to build fluidity in how we make progress. So these are all great things that keep the team feeling close.
Holly:
Yeah. Tell me a little more about the meetings that you host. Are they sort of the standard? Does your team follow scrum or agile principles? What kind of checkpoints do you have that keep the team together?
Sam Haveson:
Yeah, I mean, I think product teams change depending on the life cycle of the product that that team is working on. And so broadly speaking, the experiences that I've worked on and delivered, they've changed. So whether it's an experience where we have like a full stack team, and so you have mobile developers, you have backend developers, and you're really trying to make sure that experience is end to end, agile works really great. We find that is an environment that we like to adopt. Maybe sometimes the change is just on the client and it's maybe just a mobile change and you might employ scrum. So it changes a little bit. But broadly speaking, I would say agile development is what we've employed and what I have found to be best.
Thinking about meetings and how to bring people together, at the end of the day, people are looking for the information that they need to do the best work of their lives. And so making sure that people have a venue to share information, making sure that people are abreast of information, and that there's really this flow of knowledge is the end goal for me. When I'm hosting meetings, it's really about information flow. Otherwise, there's other tactics you can employ. So that's one thing that I try to do with cadences that we set up in making sure that people feel really informed, involved and can discuss blockers, can discuss things that they're facing, any issues, flag them, field them, and be able to do the best work of their lives.
Holly:
I like what you said about information flow. That's one of the ways that I often think about the communication structures that we put in place, the sort of scaffolding for communication: how do we support that information flow?
Sam Haveson:
Yeah. Information flow is different for every person, right? Some people might be best poised to receive information visually. And some people might be best poised to receive information in an auditory way, right? They might need to listen to it or hear it. And others might actually prefer getting information through documentation, through reading and maybe in sort of like a silo. And so also figuring out what is the preference of each person you work with and you collaborate with and how do they best absorb information is key because we all absorb information differently. And being able to sort of tune how you do that knowledge transfer based on the preference of the person, that's something that I'm continually looking at evolving and making sure is something that I'm mindful of. I myself am actually best with visual, so I appreciate information flow and knowledge transfer in a visual manner. But that changes for every single person, right?
Holly:
Yeah. It's interesting that you mentioned that. I know some leaders like Teresa Torres are really big on visualizing things because it externalizes your thinking. And I value that greatly, but my own personal style is more likely to be reading than visual or auditory. So for me, I have to remember, "Oh, other people like it other ways."
Sam Haveson:
Totally. Totally. And that's the beauty of humanity, is that we all tend to absorb and receive information differently. But when we recognize how we can work with others and their preferences, it can make things remarkably successful. This is the case also when you're working with customers and you're doing customer discovery. How does the person on the other end that you're speaking to want to engage with this concept that maybe you're trying to suss out? Is it something that they need to see and play with? Or can they just hear about the abstract, right? And so, how you learn and do customer discovery is also not just about making sure you're placing information in front of a customer the same way. You might need to employ different tactics. So this goes for collaboration within the product team that you're building on, in addition to how you're presenting information and concepts to customers. Being able to tune and understand when they're being responsive and when they're rocking it and when they're getting it relative to when you might need to try something else.
Holly:
Yeah. It's really important to just meet people where they are. I think whether it's talking to customers or talking to other team members and colleagues, that's one of the things that I sometimes find myself advising earlier-career people to practice: trying different methods for different people and meeting them where they are.
So changing gears a little bit, one of the things you said earlier that piqued my interest was, after your time at Amazon you decided you wanted to go somewhere smaller. Why did you want that?
Sam Haveson:
Yeah. So Amazon is an incredible company. They have some of the best business acumen in the industry and some of the best technology scaled in the industry, but that's also a direct result of the total workforce that's there. I mean, it's a really, really large company. Each team within the large company definitely functions in a smaller environment. And it's kind of operating in more of a business unit, if you will. And there's a really great thesis that actually came out of Amazon called two-pizza teams, which is that the best teams typically can get work done and the size of them should be reflected by, can they be fed with two pizzas? So I did see that a lot at Amazon, taking a really large company and then trying to figure out environments for people to work in smaller, more agile settings with respect to the business unit or product line that they were working on.
Remember, Amazon is so many different services, from Prime, to Alexa, to Amazon Music, to AWS. These have all been just such incredible businesses. So I was just super pleased to be able to work there, learn, and grow and develop. But it wasn't lost on me that it was a really, really large company, right? And it was matrixed in that sense. What I was looking for was to find a company that I could work at which was also a product that I loved and try to do the best work of my life in sort of a smaller setting. So Twitter, at the time when I joined and had this opportunity to join that product team, it was a lot smaller. The cross functional experience team, which is comprised of engineering, design, research, data science, product, it was so much smaller than it was at Amazon.
While Twitter had a really global footprint, an amazing user base, and just enormous traction in the industry, having already been public at that time and functioning as a great company, it still was small in terms of the product development teams. We've grown a lot over the last few years. I'm coming up on almost four years at Twitter, which has been such a privilege. But we've grown an enormous amount in the last couple years. But certainly, even at the size that we're at now, we're still not the scale or size of some of our peers. And certainly we're not as large as a company like Amazon.
Holly:
Are you able to put some numbers on that? How many engineers and product people are there right now?
Sam Haveson:
I would say we're roughly around 5,000. That's not the total size of the company. The company is probably closer to a little over 10K and we're growing, but it's remarkably exciting to know that we have the ability to drive impact, to work on a service that impacts so many people whether it's from politics, to entertainment, to pop culture, to journalism, and thinking about the fact that a lot of the people that come work here are real owners. And even as we scale and we grow, there still is that sense of it being smaller, it feeling smaller than a lot of our peers. I mean, if you look at Amazon, it's 100,000 plus, at Facebook, 60,000 plus, et cetera. So that's sort of what I'm talking about with regards to scale of company, but also being in an environment where it has a global footprint, it has a strong brand, and it has a really devoted user base that you can continue developing and growing.
Holly:
Yeah, that sounds like a lot of fun. So I think we're coming close to the end of our time. I like to ask people what advice they have for aspiring product leaders. So if you could tell them just one thing, what would you tell them?
Sam Haveson:
That's a great question. So in addition to being a senior product leader at Twitter, I'm also the co-chair of the Women in Product group here at Twitter. I care an enormous amount about developing and growing female talent into product. So I'm always passionate about that line of work. I've done a lot in my career to mentor and make sure that we are growing the total number of women in product management and making sure that we have a voice and can drive impact because we serve global users. And so we should make sure that we're imbuing the knowledge we have within the products that we work on. And no better people to do that than women. So I care a lot about diversity and inclusion.
In addition to the Women in Product group, I'm also an advisor and mentor to graduate students at Cornell Tech, which is the MBA program that I participated in. I actively help develop them and give them advice and guidance in how to break into product management and be successful and how to take the skills that they have and harness them into the teams that they may be joining.
One of the biggest things that I have always reflected on is that authenticity is a really, really important ingredient in the recipe of doing great work. If you're authentic about who you are, if you're authentic about the process, if you're authentic about the problems, that can be a world of difference. We tend to try to maybe emulate or mimic other product leaders. And that might be helpful in early stages of careers, but actually leaning into your own authenticity, figuring out which end you sort of contort to on the product spectrum. Because again, we have very technical product leaders, we have very design-oriented product leaders, we have very business focused product leaders, being authentic with where you are on that spectrum is super important and leaning into that strength.
And so I tell people, "Be the best version of you. Be remarkably authentic. And stay true to listening to yourself as you develop within your career," because where you sit on that spectrum in the first five years, it might be different in the last 10 years, it might be different in the last 20 years. So being open to seeing kind of how you grow and develop is important, but authenticity is something that should remain true. There's only one version of you and you're probably going to be able to do the best work of your life if you're staying true to that person.
Holly:
I love it. That's fantastic. So where can people find you if they want to follow you?
Sam Haveson:
Well, as you might have guessed, you can definitely follow me on Twitter. You can also follow me on Instagram, on LinkedIn. I will be able to share all of my handles. Hopefully that's something that we can post or that people will have access to. The best place to find me though really is on Twitter. I'm an open book. I would love to connect with other leaders in the space, collaborators. I'm open to being a resource to any folks who want to go deeper on any of the topics that I talked about today.
Holly:
Wonderful. We will put the handles in the show notes, but just for the sake of listeners, what is your Twitter handle?
Sam Haveson:
Absolutely. So I think one thing for everybody to know who has a Twitter handle, that's something that has been true to you and maybe it's something you created a long time ago. For me, over a decade ago. So my handle is @samhaves, which is my full name but just not the last two letters. So samhaves, @samhaves, @S-A-M-H-A-V-E-S, samhaves. And you can find me on Twitter.
Holly:
Wonderful. Well, thank you so much. This was a really fun conversation, Sam.
Sam Haveson:
I'm so happy that we were able to spend time today, Holly, and chat. Thank you for having me. I look forward to talking again in the future.
Holly:
Yeah, me too.