Brian Kardell: Okay, hi, I'm Brian Kardell. I'm a developer advocate at Igalia.
Eric Meyer: And I'm Eric Meyer, also a developer advocate at Igalia. One of the things that Brian and I talk about a lot, not just on Igalia chats, but in general, is that we don't really know what people want on the web or what the web is even doing, and so we thought we'd talk to somebody who tries in their own way to fill in those gaps. So, guest, please introduce yourself.
Sacha Greif: Hey. My name is Sacha Greif, and some of you listening might know me as the person who runs the state-of surveys, State of JS, State of CSS, State of HTML, as well as a couple other surveys now. Yeah, I've done other things before that, but lately it's been my main focus. So I wouldn't say I know what people want on the web, but at least I'm certainly trying to figure it out. And yeah, hopefully we can talk about that today.
Brian Kardell: I like the story of how you got started on this, so I wonder, if you don't feel like it's told out or anything, would you mind telling people how this started?
Sacha Greif: Sure. Well, what happened is... So the first State of JS survey started in 2016, so almost 10 years ago. And at the time, I was using a framework called MeteorJS, which not many people remember these days because things move pretty fast. But the whole selling point of Meteor was that it was an all-in-one environment. So you had your database, your front end, your back end, everything was integrated, which was awesome in terms of productivity, but the downside was that once I came out from that ecosystem, once I ventured into the wider web development world, I felt very lost because now I had to piece together all these different bricks to form a whole stack. And so I could find usage stats, like npm downloads, or maybe even GitHub stars, but it was very hard to find data about what people actually liked. Because you could have millions of downloads for Backbone.js, because at the time, it was one of the main JavaScript frameworks, but that wouldn't tell you that it was actually on the way out and that React, Vue, and Angular were going to replace it. So I wanted something that looked beyond just raw data or raw download counts, and that's how I got the idea for that first State of JS survey. I wanted to ask people not only what they are using now, but how happy they are with whatever library they're coding with, and from that, try to figure out the direction of the JavaScript ecosystem at first, and then from that, I expanded to CSS to figure out things like which CSS features were up and coming and worth learning about. Also, State of HTML, which is more generally about... Basically any area of the web platform as a whole that catches my interest or that I feel I need to learn more about personally, I figure other people will have the same inquiries, and it makes a good candidate for a survey.
Brian Kardell: About how many people do you get to respond, and has it changed dramatically over the last 10 years, like the size, the sample size?
Sacha Greif: So what happened is when we only had State of JS and State of CSS, I feel like most of the respondents were concentrated on those two. So the most I ever had, I think, was maybe 20,000 respondents to a single survey. These days, because there are more surveys, I feel people kind of gravitate towards the one that is most interesting to them, so it might be between 5,000 and 10,000 depending on the survey. But it's growing at its own pace, and I feel as long as you get a couple thousand, that's already enough to have a good data set and be able to derive some interesting insights.
Eric Meyer: What's a good example of an interesting insight?
Sacha Greif: Well, yeah, people always ask me that, and it's very tricky to answer because the danger is that you will find insights that aren't really insights, and what I mean by that is you can find correlations that aren't really borne out by the data. So one thing I like to talk about is how I started asking people about their job title, and one 'insight' that I found and was very proud of myself about was that job titles with the word developer in them make way less money than the ones with engineer in them. So that's an interesting insight. Does that mean that people can change their job title and start making more? Well, probably not, because what's actually happening, as far as I can tell, is that larger companies tend to hire for front-end engineers and smaller companies tend to hire for front-end developers, not because the job descriptions are that different, but more because the market is just segmented that way. And larger companies tend to pay more, and smaller companies tend to pay less, and that maybe explains the discrepancy in income per job title. But even that, that's not something I can definitively prove in any way. More research would be needed. So on one hand, that's kind of a cool insight, but it's also not anything that I can defend on a scientific level. It's more a guess or intuition, I would say.
Eric Meyer: Yeah, that's the thing about surveys, voluntary surveys especially, is that you're limited to who answers, and is that fully representative? Good question. It's different than a national census, for example. That's the thing that I think people who run surveys always have to remind people of: this is a snapshot of the people who answered.
Brian Kardell: I think there's also the question of what the purpose is, why we're asking this. One thing is to ask questions to learn which questions we want to study more. So I think this is a really interesting insight that you came up with, and you can create a hypothesis, and somebody could study that more and find out. Maybe there's truth to it. Maybe that's exactly it... Is it correlation? Is it causation? Is it something you can defend? Do you have a stated goal, or is it to study as much as we can, with as much diversity as possible, and not try to claim that it's necessarily representative of the web at large?
Sacha Greif: I think the main goal, at least, because there are multiple secondary goals as well, but the main one would be to help people decide how to spend their time when it comes to learning new technologies in web development, whether that's which front-end framework to use, which CSS feature to learn, even things like how much they should invest in AI these days. And I think from that point of view, the surveys do actually do a pretty good job because they haven't really been wrong. There hasn't really been a case where in, whatever, 2018 I said that React was very popular, and then these days nobody talks about React. That's not what happened. React is still popular. Talking about CSS, the :has() selector is clearly very popular in the surveys, and that's borne out in the real world. It's not something that's out of the blue and not represented in reality. So I think in terms of the really basic goal of the survey, which is maybe not even finding new insights, but just confirming trends, because that's also useful. Sometimes everybody is talking about technology X, Y, Z, but then you'll have people who will dismiss it: 'Oh, that's all hype. It's all paid marketing by such and such company.' And the survey can be like, 'No, actually, that's actually what people think. Developers do enjoy using that feature, that framework,' and so on. So sometimes it's not about finding these hidden insights, and I've fallen into that trap, so that's why I thought the previous question was important to touch on. Because sometimes as a survey runner, I want to be like, 'Hey, look at this thing I found out. Look at this thing I discovered, that actually engineers make more than developers,' or 'People in whichever country use this more than that.' That's what sells, in a way, because those are the things that you can mention on a podcast that make you sound good. But I think that the core purpose of the survey is actually just to confirm the trends that we all know about. Well, first of all, confirming is valuable, and also sometimes people don't actually know about those trends because they might be new to the ecosystem. So if you're just learning about CSS, you might not know that grid is actually not that new anymore, but it's worth learning about. Flexbox is also worth learning about, :has(), aspect-ratio, because there are like 200 new CSS features, and the survey can help you locate those that are most worth knowing about first.
Brian Kardell: Yeah. Actually, I am a big fan of trying to get data from the web, and it's really hard. We have a few different sources that Eric and I look at and have conversations about. There are the Chrome Status metrics that are available. There's some RUM data that's available. There's a Firefox dashboard about user activity and things like that. But one of the things that is... First of all, those all have their own biases, which would be interesting to come back and talk a little bit about when we're done with this topic, whether there are biases and what we do to help get past them. But the thing that would be really difficult to unpack from those data sources is trends, really. It's trends, because the data from, say, the HTTP Archive is millions and millions of sites, and those millions and millions of sites, they're not necessarily being built every week and updated every week, or even every year, necessarily. Some of them are largely legacy. So you see, for example, that huge, huge amounts of them still use jQuery. And I don't think that people are actively going out and choosing jQuery. Those sites are just sitting there working perfectly fine, but nobody's actively choosing them. So I think the survey data is more timely and it offers a different view of similar data, as you're saying, to developers.
Sacha Greif: Yeah. On the topic of resources to help you keep up with features and the platform as a whole, I actually wrote a post on CSS-Tricks categorizing all the resources I could find. And even as someone who's supposed to be keeping up with all those things, I was surprised at how many different sites there were. Google by itself has two or three different indexes and directories of statistics and features and stuff, and of course, there's MDN, there are the surveys. So, yeah, it can be a lot, especially for new developers who are thrown into that world. But that's also why I've tried for the surveys themselves to be a jumping-off point to those other resources. You guys are familiar with Baseline, of course, so I've been trying to add the Baseline indicator to the survey. So I try to include pathways to other things that can help you learn about the web, and not just the surveys themselves. I was very proud of the name State of JS. I thought, 'Oh, that's catchy. That's very authoritative,' when I first launched it. But I realized the downside is people will be like, 'Well, you're trying to speak for the whole web and you're trying to say that's the way things are,' and of course, that's not the case. So over the years, even though I've kept the name because it's kind of what it is now, I have tried to be like, 'Hey, this is one survey, but there are other surveys, there are other indexes, there are other stats, there are other people,' and I want to make it clear that this is not a self-contained thing and that it's really a group effort. The surveys themselves are a group effort, but also just the whole web platform.
Brian Kardell: Yeah, we also, I think, ask questions in the state-of surveys about what you want that you don't have yet, right? There were questions, I believe, for several years about container queries, which were always at the top of every survey we ever asked, and :has() before container queries existed, and then I think it was number two for a long time. And I don't know, maybe now that we have them, to be honest, I think :has() might be number one in practice in terms of how much people use it and like it. It's always hard to prognosticate those things, but it's tricky. How do you think we should use the data that we collect here to shape priorities and things? Like, Eric and I are on the Interop committee, and that is us getting together and trying to discuss, which way should we prioritize and get everybody on the same page? Which things should we really focus on?
Sacha Greif: Yeah, I think definitely the state-of surveys can be one factor among many that are considered for things like Interop. I think maybe one aspect where the surveys can stand out is we can actually ask people... Not just look at usage data, but actually ask people what they want. So that's a bit different, I think, from looking at Chrome stats or Firefox stats or whatever. And we can even ask them about things like what's missing from the language, what's missing from CSS or JavaScript or HTML, so things that go beyond just, 'Okay, what do you think of this or that feature?' So, yeah, the fact that we have more freedom, I think, in the context of a survey makes it pretty interesting as a tool for helping browser vendors decide what to focus on. And I've especially been working with Google on that aspect because they finance part of the surveys, the Google Chrome team basically, but I also have contacts with other browser vendors and I've tried to make it clear that the survey is not affiliated with one party or another. It's something that, I think, in the end functions more or less as an independent entity.
Brian Kardell: You mentioned Google sponsoring some of the work. I think on the page you have some others: Frontend Masters, GitNation. Are these affiliations, or do they help sponsor it? Is it a collective? I'm kind of curious, because I don't actually know... You would think that I would know the answer to this, but I don't know the answer. So how does it work?
Sacha Greif: Yeah, well, I've tried different monetization strategies for the surveys. We sell T-shirts, which I'm wearing right now. At one point I tried to let people sponsor the charts. I thought that was a cool idea, where people could buy a coffee, pay $5, whatever, and then they have their name next to a chart, like the Adopt-a-Highway system or whatever it is in the US-
Brian Kardell: I like it. Yeah.
Sacha Greif: ... so you can adopt a chart. But that was a bit cumbersome to maintain. You had to have a payment processor and update the site every time. One thing I've never really wanted to do is charge for access to the data, because I feel people gave me their data for free and they spent their time filling out the survey for free, so I wouldn't want to make the end result paid. So I guess what's left is just sponsorships, having companies partner up with the surveys and sponsor them directly. And so that's what Google and these other partners do. Practically speaking, Google is the largest sponsor and they are involved in the survey design process. And again, that doesn't mean that the surveys are going to be favorable to Chrome over another browser. What it means is that if they want to figure out something, and I think it's a good question and it makes sense to ask, I might include it, or I might say no. I still have complete editorial independence. And also, the whole survey design process is transparent. On GitHub, we have public threads. So that's Google. And then I have other partners, which are not browser vendors, so they are less involved in the actual survey design process. But for example, for Frontend Masters, they have links to their courses in the survey results, sponsored links, also because those are pretty high-quality courses, so I don't feel bad about including them. I'm pretty confident that people are going to get value from them if they want to learn about those topics.
Brian Kardell: Yeah, it seems to tie to what you say is the kind of goal of the survey, so you can help people connect with training. I agree. That makes a lot of sense.
Sacha Greif: Yeah. And then other partners, so GitNation, they put on events, which I have spoken at before, but they also support the surveys. We have Algolia, which... I'm not going to list all of them because it'll turn into an advertisement, but basically, yeah, there are a bunch of companies sponsoring the surveys because that was the most, I guess, ethical way I could find of just making a living. But I definitely try hard to not let that affect the end result. And I'm not aware of anybody having a problem with that so far. I'm very open to criticism. We can talk about that as well, because I have received my share of criticism, but so far none of it has been about how I fund the surveys.
Eric Meyer: What are some of those criticisms?
Sacha Greif: I don't know if I'm weird or something, but my way of responding to criticism is to actually talk about it too much. Maybe it's defensiveness, but you can find articles, very, very long-winded articles, where I talk about those things. Chief among them, I would say, is the issue of representativeness, mostly in terms of gender diversity. Again, I've written about that extensively, so I don't want to go too in-depth here, unless you want me to. But in a nutshell, the surveys have always skewed very male-dominated, and I think that's because the industry skews male. I think that's fair to say. But then on top of that, the surveys had their own bias for a bunch of reasons, one being that social networks also add their own bias. So if you share the survey on Twitter and Hacker News, you can imagine that that's going to compound the bias that's already inherent in the industry, because those platforms tend to be male-dominated, for whatever reason. Maybe they're not welcoming. And then on top of that, I think I haven't been very good about addressing the bias. It's two issues: the bias was there, and then I didn't address it properly. So to give you an example, one question in the State of JavaScript is about who you follow online; which video creators do you follow? And at some point in the past, I tried to make that what I call a predefined question, where you have a predefined list of options. So you can just click who you follow, who you watch on YouTube. And that list didn't include any women creators, because it was based on people's answers to previous surveys. So because the surveys had that bias, for whatever reasons, that's how the list ended up being, and that sent a really wrong message about inclusivity, because you're confronted with a list of 20 guys. The fact that the list doesn't feature any women is a problem in itself, because it sends the wrong message to any women who might be taking the survey. It sends the message, 'Oh, this is not really for you.' And so if the goal really is to have more diversity, well, having a list that doesn't feature any women is counterproductive. I think as engineers, we can be very process-focused, where if you follow all the right steps, the result is going to be good. But in this case, I thought I was following all the right steps, and the result was still bad. So for example, in subsequent years, I just didn't have that list of options. Instead, it's a free-form text field, which has its own downsides, but at least it's not sending the wrong message as you are taking the survey. So I guess that's an example of a criticism and how I've adapted to it.
Eric Meyer: How did the results change after you went to free-form?
Sacha Greif: Well, they didn't really change that much, because the largest streamers, for example, are still men. They have bigger audiences, and sometimes even women streamers may have mostly male audiences. So it's a very complicated area. I have tried, for example, to get more women streamers to advertise the surveys. In fact, I have financially sponsored those streamers to do so, but their audiences might not have that many women either. But like I said, I think it's important to separate the actions you're taking from the message that you want to send.
Eric Meyer: Yeah.
Brian Kardell: Yeah, I don't know. I guess, how does it get promoted, for the most part? And have you given any thought to the time that it takes to take it, and to ways to improve that? Could you join a thing where you get two questions a week or something like that, as opposed to having to sit down and take it all at once? I don't know. I'm just curious. What sorts of other things have you batted around that you've struggled with, and how have you maybe tried to deal with them?
Sacha Greif: Yeah, I think you're touching on at least three or four different things that are all worth addressing. So to piggyback on what you just mentioned, one thing I've realized is whenever you have something like filling out the long survey that imposes a tax, a time tax on people, that's also going to skew results in one direction. So it might not be gender-related necessarily, but there's certainly a kind of person that's more able to take 20 minutes or 30 minutes to fill out a survey, people who have comfortable jobs, people who are not afraid of getting fired if their boss is like, 'Hey, what are you doing instead of fixing those bugs,' maybe people who feel confident in their knowledge, who feel like, 'Hey, I have something to contribute to this survey. I know what I'm talking about.' And so I think, yeah, every-
Brian Kardell: Yeah, maybe you miss out on people who have young kids or something because they just can't take the time away to do it or something. There's all kinds of ways it could affect that.
Sacha Greif: Unfortunately, you guys keep churning out new features that I need to ask about. So the surveys keep getting longer, so I don't know, if we can have a feature freeze for a couple of years, then the surveys can become more manageable again. I'm thinking about the CSS survey especially. I think there were over 60 features in the last edition, and all of them felt fairly necessary to ask about. They're all things that people actually use and that could be game-changers. But yeah, whoever's listening to this, if you can think of a way to make the survey shorter, let me know. The other thing that's going against that is if you make the survey shorter, if you drop some questions one year, you can lose that continuity of data over the years, so you have a gap in your data. And for this year's State of JS, I actually dropped two sections, which makes the survey shorter, but even if I add those things again in future years, it's going to be a bit less clean in terms of looking at the trends. So that's always something to be cognizant about. And you were also asking me, oh, yeah, about outreach to Girls Who Code and so on. So I have tried that, but the results have not been very good, because I think those groups are already over-solicited, right? I'm sure they receive dozens of emails about random startups that they're like, 'Oh, hey, our new AI thing is great for women. Can you send a link to your 100,000-member email list?' So I wouldn't be surprised if they kind of have a mental block about anybody trying to access their audience, and I don't blame them for that. What I think actually worked the best of everything I've tried so far is just launching a new survey. So we launched State of Devs, and that survey is more focused on everything besides technology, so career, workplace, maybe things like discrimination. And State of Devs had a 15% rate of women respondents, as opposed to something like 6 or 7% for State of JS, maybe 8 or 9% for State of CSS. So I think this showed me that changing the questions, changing the survey itself, is probably the most effective way to change the audience. And so in the future, if I wanted to, let's say, have more students take the surveys, well, maybe I might do a survey like State of Learning Web Development or whatever, instead of trying to reach out to students to fill out the State of CSS survey, which might take a lot more work. And over time, hopefully the new audience from State of Devs or State of whatever will also percolate into the other surveys, and we can build a more inclusive audience as a whole through that method.
Eric Meyer: You said that some of your questions build on results from previous years. What are some other sources of information, data, whatever you look to in order to try to formulate questions or make adjustments to the surveys?
Sacha Greif: I would say a lot of it is based on my own personal intuition. For example, for the State of JavaScript, there's another site called Best of JS, which also tracks trends, mainly based on GitHub stars, and that's very useful just for the basic work of finding new libraries to add and tracking trends. And for CSS features, there are all the resources we've mentioned, the web features index, MDN and so on. But in terms of more fuzzy questions, not just about libraries and features, but more about trends, a lot of it is based on my own intuition, I would say. For example, there was this thing last year about JS Sugar, was it, basically standardizing how JavaScript is compiled and maybe building that more into the language, and then there was some pushback. So this gave me the idea this year to try and track these two poles of JavaScript, the bundlers and compiled JavaScript on one hand, and then the people who try to use it without a build step, just in the browser. And so I've been asking, for example, why do you need a bundler? Which bundling features do you most rely on? And from that, the idea is to try and identify, well, what is preventing us from moving to no-compile-step JavaScript, browser-only JavaScript, or runtime-only JavaScript maybe at some point in the future? So this is something where it was born of my own interest and how I could see the trends evolving, and I tried to formulate that into a question. I think that's where the experience of running those surveys for so long comes in, because I think I'm able to formulate the question that's going to maybe answer those concerns without tipping things too much in one direction or being too vague, and that's also a whole art.
Eric Meyer: You rely on your own intuition. Is there anyone else's intuition you consult in thinking about this sort of thing? Is there a board of directors or something like that?
Sacha Greif: There isn't a board of directors. There is a Discord where I will consult the community. Most of the work other than that is done on GitHub. Every survey usually has two threads, one for suggestions... So basically when I launched the 2025 survey, I also opened a suggestion thread for the 2026 survey where people can post their feedback. There's also a question in the survey itself about what questions you think were missing from the survey, where people can contribute their feedback right there. And then about one month before the next year's survey, I'll compile all that into a preview version of the survey and then also have a GitHub thread for that, where people can take the survey as a preview and then leave their feedback. And then we iterate on that until it becomes the final product. And then in terms of the more general methodology questions, I also work with an actual data scientist who reviews the surveys from time to time and gives me their feedback. Also, I know GitHub has some data scientists that have looked at the surveys. At the end of the day, the bottleneck is probably myself, because I still need to aggregate all of that into a coherent product. But in terms of different inputs, there's definitely a lot to look at, whenever I have time to take those things into account.
Brian Kardell: So I think they're available in multiple languages now, right? They're not just biased toward English as the only way to ask the questions?
Sacha Greif: Yeah, so they are, I would say, translatable, but I wouldn't say they are necessarily translated because the process to do so isn't very good. There are volunteers who work on the translations each year, but I don't yet have enough lead time before the surveys to actually have them translated before they go out. So it's a bit haphazard, and also I just don't have time to manage that translation process myself. Because ideally I'd be like, 'Okay, here's the finalized survey. You can translate it,' and then have time to review the translations, be like, 'Oh, you missed that thing,' or 'Actually, this label changed from last year, so even though you translated it last year, you need to translate it again.' But I just don't have the logistics in place to do that right now. This is an area where I just don't have the time because I work on all this by myself. There isn't really anybody else managing the surveys, and so I kind of have to make choices, because that's how things are, at least for now.
Brian Kardell: Yeah. It seems like another kind of bias then, right? And how we'd like to deal with it is to be able to ask the questions in multiple languages, to make the results available in multiple languages, but the shortcoming is that, just like everything else in the web platform, you only have so many hours in a day and only so much funding. There are only so many things you can do by yourself, and so-
Sacha Greif: Yeah. No, for sure.
Brian Kardell: If we could find funding to do that, if we could find the volunteers to help out, then I'm sure we could do a lot more.
Sacha Greif: Yeah, more funding is always nice, but I think it's also... I need to find a way to expand beyond just myself. I think, yeah, going from one person to having an employee or something, that's kind of scary for me. I've never done that before. I've always worked solo on basically everything I've done. There are a couple of times where I tried to involve other people and it didn't... Well, I enjoyed the experience of working with other people, but the other person always drops out at some point, so it's either, let's say, something about me, or just that it's hard to keep good people working on new projects. But yeah, I'm always open to new collaborators and I'm open to paying them, but it's still hard to find people.
Brian Kardell: Do we know about the demographics of the people that you're... Do we know their ages? What do we know about the people that you're surveying, and how diverse is it really?
Sacha Greif: Yeah, I know their country. I know which language they used to take the survey. I know age, gender, even race and ethnicity, which is a bit controversial, but I think it's important to at least have one source of demographic information in the web development community because I'm not aware of other surveys that ask about that. So, yeah, we do have that data. And of course, it is very skewed towards European people, people in the US or North America, white people, men. And I haven't seen anything that would indicate that, let's say, people in different countries use different CSS features. Of course, they might-
Brian Kardell: Some. Some. Because I would say some international features are more applicable in different parts of the world-
Sacha Greif: That's true. Okay.
Brian Kardell: ... is what I would say. Yeah.
Sacha Greif: No, that's true. Okay. That's a very good counterpoint, yeah, like right-to-left language features. Yeah. So let's carve out a niche for those features, but apart from that, for things like grid, :has(), or JavaScript libraries, in terms of predicting the trends, I haven't seen anything that would say, 'Hey...' where different audiences diverge and gravitate towards different things in a meaningful way. So at least from that standpoint, I haven't... Maybe I'm going out on a limb and maybe I'm wrong, but I've yet to see an indication that the blind spots in terms of audience lead to blind spots in terms of not being able to rely on the trends, if I can say that. I think it's a small comfort. But that doesn't really diminish or excuse the problems and the biases in the data.
Brian Kardell: Yeah, I'm not trying to consciously call you out on bias for this. I was actually intellectually curious myself if there are differences, because I don't see a reason that there should be in respect to this, but it's exactly the kind of surprise that wouldn't so much surprise me, to find out that there were differences somehow. There are things that we just take for granted here in the US, like, 'Oh, search... that's a Google thing.' But if you go to maybe Russia, it's a Yandex thing. It just depends where you go. What are the perspectives? And then I imagine that there are follow-ons to those. If you're in China, let's say, I think you will have this understanding that mini apps are a big thing. Here in the US, it's really not. So you have this perspective of what the model of the web should be going toward, maybe, or should be trying to do, or should be wanting to answer, that could be very different. Certainly there are fashion trends that wind up being different, and maybe that means you use different CSS features. I would just find it very fascinating to see a really diverse set of inputs and look for differences and then to try to follow up and say, 'Why? Why are they different?' Are there explanations for that, and is it useful for us to know that they're different?
Sacha Greif: Yeah. It's not so much that the differences don't exist, but I think in terms of fulfilling the survey's role, where it's predicting trends and trying to see what's up and coming, I think the survey can still do that job even with that demographic skew towards a certain audience. Again, it could be much better and it could also go further. If we did have more respondents from the Middle East, from Arabic-speaking countries, then we would have more data about right-to-left writing systems and layouts and so on. So that would definitely make the surveys richer and better. At the end of the day, sometimes I get defensive because I know it's a problem, and it's something where I also don't want to just give up and be like, 'Okay, well, I'm not capable of fixing that, so I'm just going to give up.' So I try to see it more as, 'Well, I'm already providing value with what I do, and I'm just going to try to provide even more value to more people.'
Brian Kardell: Yeah. I would say we have a lot of listeners, and if you're maybe interested in helping with that problem and you have ideas, and especially if you have ideas and money, then write to Sacha and let Sacha know. There are things I can imagine, like partnering with companies in those parts of the world to say, 'Hey, maybe in addition to sponsoring this, you give your engineers the time to fill this out so that we get at least 100 respondents from here and 100 from there and 100 from there.' I can imagine ways to do it, but all of these things, I think, are iterative. We want to just keep getting better and better and better. And I actually really like that this all springs from just kind of a passion project that you did because you had an itch you wanted to scratch, and it's kind of grown and taken on a life of its own, with funding. Yeah.
Sacha Greif: Yeah, you're giving me ideas, because one thing I've learned is to try to solve survey issues with more surveys. So I could have a section in the next State of CSS specifically about right-to-left layouts. And of course, you might say, 'Well, 99% of people are not going to fill it out because they've never used that,' but then that in turn might attract a whole new audience because now they're like, 'Oh, wait, the survey is for us now,' and so-
Brian Kardell: Yeah, absolutely.
Sacha Greif: Yeah. I think there are always things you can do. And that's why it's also good for people to call me out, because that's often what pushes me out of my comfort zone towards asking new questions, making new surveys, rethinking my own biases and the surveys' own biases. So I wouldn't say I always welcome criticism, because it can be tough to take, but eventually I think I kind of get to a point where I can actually use it and respond to it in a positive way.
Eric Meyer: So what are some things that you've always dreamed you could do, but you either haven't been able to or haven't managed to fit in? What is your ultimate ideal of... If money and time and resources were no object, what would you want to do with-
Sacha Greif: I would love to improve the localization system, like we discussed previously, and make it actually work. There's a lot of things to do in terms of just refactoring the code base because it's all a bit wonky and there's a lot of legacy code.
Eric Meyer: All code bases are wonky, so don't feel too bad.
Sacha Greif: That's true, but mine might be wonkier than most. I don't know.
Eric Meyer: Sure.
Sacha Greif: In terms of improving the surveys themselves, well, it's always cool to be able to play with new ways of asking questions. I think in 2023, Lea Verou was hired by Google to help with the first-ever State of HTML survey, and she actually designed a really cool UI control that lets people answer questions along two axes at the same time. So what I mean by that is you can, in one click... It's almost like a matrix where you click both whether you have used a feature and also whether your impression was positive or negative. And she designed it in a very elegant way. And that was possible because of Google's support, Google's funding, and her time. And we were even able to usability-test that feature, so we could know that it would work before actually launching the survey and not have 10,000 responses where people misunderstood how it works, making the whole survey worthless. Because we tested it before, we avoided that, so it would be very cool to have more funding to do those kinds of things, usability testing, accessibility testing. There are people, like Adrian, who are always calling me out on accessibility, and he's totally right. There are accessibility issues that are long-standing that need to be fixed, but I don't always have... I know it's a cop-out, but it's often, again, time and knowledge on my part. Having more people, so that I'm not just the only bottleneck for everything, would be great. And then another thing that I would love to do is, I've accumulated so much data, and even things like metadata, about the ecosystem. So I have these huge lists of JavaScript libraries, CSS features, and also just people, developers, video creators and so on. I would love to make something with that, like a directory maybe, or enrich the surveys in some way. There are a lot of resources still untapped in the surveys. And of course, there's a lot more that could be done with the data. All the analysis I do on the data is extremely basic, because my philosophy is, first of all, I'm not a trained data scientist, so I don't want to venture too far into the weeds, but also I want people to be able to understand the data that I'm showing them even if they are not data scientists. So I'm a bit afraid that if I do something too complex, using more advanced math, I won't be able to explain it properly in a way that people can understand. But if I could call on more experienced data scientists, data visualization specialists, they could come up with ways to do this better, maybe. In the past, I've worked with Amelia Wattenberger, for example, who is extremely talented and a specialist in data visualization, and she came up with things that I would be incapable of dreaming up, let alone implementing. So there's definitely a lot that could be done on a larger scope, I guess.
Brian Kardell: We have these different communities on the web, different focus areas, let's call them. So maybe you have people who build e-commerce or sort of pamphlet sites. There's maybe one group of people that builds those kinds of things. And maybe there's another group of people who builds applications. And then there are people who can spend a whole entire career building intranet applications. I would say that their needs and their experience are not always the same. So, every year in Interop... I'm a co-chair of the MathML working group, and every year in Interop, we get this like, 'Hey, let's get MathML finally to work well in all the browsers. Let's just pass all the tests. That would be a great thing.' And for a group of people on the web, that is a super, super, super important thing, but who that group is is pretty specific. It's not most people who are going to read your survey, for example. It's not. And I think this goes with your right-to-left text questions and things like that. There are certain people it's going to be very, very important to, and lots and lots of other people it's not going to be very important to. So I'm just curious, are you asking some questions to help look at the data in terms of... What do you primarily work on in this field? Do you even ask a question about whether people do primarily intranet work or internet work?
Sacha Greif: Yeah, yeah, there is a question about that. It's mostly split along the axis of web apps versus websites versus print work, if you lay out PDFs using CSS, or, whatever it is, designing emails, I don't remember the options. But the majority of people, I would say, work on web apps these days, the majority of survey respondents. So I think that's important to know. Because, to me, I've been doing web stuff for a long time, and there was a clear evolution from documents to apps, where I think for a long time we were trying to build web apps with CSS and it just wasn't the right tool for the job, and it's added features like flexbox and grid over time to help with those very complex app layouts with a sidebar and heading and this and that. And so, yeah, I think it's interesting to see that evolution, that the job description is kind of changing. And of course, in parallel to that, there are also different job descriptions within the same job, in a way. I remember Chris Coyier had this great article about the great divide, with the front of the front end and the back of the front end, and I feel there's-
Brian Kardell: That's a great article. Yeah.
Sacha Greif: Yeah. You might have the middle of the front end now, because I feel... I would consider myself full stack in terms of back and front, but then I look at things in the State of HTML surveys, and there are so many APIs I've never used or barely heard about. So I know about CSS, I know about JavaScript, but then web APIs, native browser APIs, are very unfamiliar to me, and I feel it's almost like a third job that's cropping up these days. So, yeah, I think you're right that trying to pinpoint within the world of web development what people are actually doing is pretty important and can inform their responses.
Brian Kardell: Both of you might know better than me, but wasn't Chris' Great Divide actually riffing on a Brad Frost piece, too, I think?
Sacha Greif: Yes.
Brian Kardell: I think Brad was the one that did the front of the front and middle of the back or whatever, right? Yeah, that was a great time for a lot of really good content that got created on basically that topic, and some good talks too. Chris had a really good talk that he gave here in 2019 or 2020 in Pittsburgh at Abstractions that I really enjoyed. Yeah. So is there any new stuff that you want to plug or...
Sacha Greif: I'm almost certain that one of our surveys will be ongoing whenever you're hearing this, so you can go to stateofjs.com or stateofcss.com... Well, State of CSS is already done, but it's probably going to be State of JS or State of React. So either one, if you happen to use React. And stay tuned for the State of HTML survey results dropping soon-ish, hopefully. What else? Next year, well, in 2026, we'll see what surveys I'm doing. I'm not sure yet. I did six of them in 2025. That was a bit much, but then again, two of those were new, so we had State of Devs and State of AI. So maybe it's not going to be as much work to do them for the second time. Yeah, in terms of what's new, always new surveys to fill out, always new data to look at. State of CSS 2025, that data is out if you haven't seen it already, at 2025.stateofcss.com. And yeah, apart from that, just come say hello on the Discord or on Bluesky. I'm not on X anymore. I'm also on Mastodon. So I try to make myself available. And even if it's just to ask questions about the data, ask questions about whatever you want, I'm here.