Brian Kardell: All right. Hi. I'm Brian Kardell. I'm a developer advocate at Igalia.
Eric Meyer: And I'm Eric Meyer. I'm also a developer advocate at Igalia.
Brian Kardell: And on today's show we're going to talk with a guest. Do you want to introduce yourself?
Philip Jägenstedt: Sure. Hi. I'm Philip Jägenstedt. I'm a software engineer on the Google Chrome team. Yeah. It's a pleasure to be here.
Eric Meyer: Thanks for being here.
Brian Kardell: Yeah, definitely. If you know Philip, you probably know him as @foolip, his handle on most things, which I think is just like-
Philip Jägenstedt: That's right.
Brian Kardell: The best screen name. I love it. It's maybe second only to @svgeesus which I think is great. Really great handle.
Philip Jägenstedt: Also a good one.
Brian Kardell: Yeah. Yeah, so we're going to talk about Interop 2023 which, I think, is it the third installment of this collaboration, or the fourth?
Philip Jägenstedt: We might count it as a third. Right?
Brian Kardell: Yeah.
Philip Jägenstedt: We had Interop 2022, and then before that we had something called Compat 2021.
Brian Kardell: Right.
Philip Jägenstedt: Well, we renamed it, but it's the third, spiritually, in the series.
Brian Kardell: And what it is is sort of a collaboration between browser vendors and people who work on the platform, the implementation and specs, to work together to prioritize improving the interoperability of things. Right?
Philip Jägenstedt: That's right. Yeah, and try to focus on things that we think matter to web developers and users, but mainly web developers.
Brian Kardell: So it seems like a weird thing. I'm going to just be honest. It's really great that we're doing this. Everybody's really excited about it, but just to step back, it feels like a thing that shouldn't exist ... Right?
Philip Jägenstedt: I can see what you mean.
Brian Kardell: Yeah, because if it is the standard, it should be interoperable. Right? Those two things go very hand in hand.
Philip Jägenstedt: I wonder if that's really the case in any kind of industry. Do you just write the standard, and then everything works?
Eric Meyer: It depends on the industry. I can actually speak to this a little bit. As an example, the DVD standard. The industry figures out how to make everything interoperable, and then once they've done that, then they write down the specification. Rather than writing everything down, and then trying to get people to adhere to what was written down, they literally just document what was worked out between the competitors in the marketplace. I mean, it used to be that the W3C was absolutely the complete reverse of that, where a working group would write a specification of, 'This is what we think people will want.' Then it was up to implementers to implement that or not, which is why we have a number of features, or had a number of features, in HTML, and CSS, and other 'standards' where things just never got implemented, because there was no interest from implementers in doing it.
Brian Kardell: Well, let's talk about where did we get the Interop effort. Do you want to give us some background?
Philip Jägenstedt: Sure. Yeah. We can go pretty far back. I think maybe we can talk about web platform tests and the origins of that. The Interop effort is not synonymous with web platform tests by any means, but I mean it's building on that foundation. Web platform tests, it feels like just a staple of our industry now, I guess. It's just there, but it wasn't always there. Before web platform tests, there were these pretty high profile tests, like the acid tests. There was an HTML test suite. Then somewhere along the way, I don't know the exact year it was, web platform tests was formed, bringing together, I think, the HTML test suite. Maybe some DOM test suites. Sort of over time it grew into a bigger and bigger thing, and eventually it just became the test suite for the web platform, or almost the whole web platform. We still have ECMAScript tested separately, for example. That is really a very long sort of progression from no interop testing to interop testing by default, almost.
Brian Kardell: So Tim Berners-Lee created the first browser in, what, like 1991? And very quickly we got a bunch of browsers. We got the W3C in 1994. It was 2013 when Tobie Langel blogged about reaching the amazing point where one percent of the platform had web platform tests.
Philip Jägenstedt: Yeah. Since we're naming names, I don't know the exact dates, but my guess is 2013. Another person who did a lot for web platform tests early on, and still does, is James Graham from Mozilla. I think he sort of built the whole infrastructure, so that's, I think, Mozilla overall. They were the first to use web platform tests as sort of one of their own test suites. Then Chromium came along, I don't know how many years later. Three years, something like that, before Chromium started using it as a serious test suite, I'd say. Yeah. 2013, that is pretty recent still. It's 10 years, but compared to the age of the web, it's recent. I suppose that's your point.
Brian Kardell: Yeah. I mean, I think that the whole effort really got started about halfway, or even a little past halfway, into the lifespan of the web, and it's huge. Right? I mean, the web, the API surface between 2013 and now is growing really, really, really fast. Right? So we have to create tests and get interoperability on every new thing that we add. Plus we still have so much to finish of the stuff that was around before. Right?
Philip Jägenstedt: Yeah.
Brian Kardell: I think the thing that I would like to mention, though, is the MDN Compat survey. Can somebody maybe?
Philip Jägenstedt: Yeah. I know something about that, but before the Compat survey there was the Developer Needs Assessment, Web DNA. That was in 2019, the first of those. That was a Mozilla effort. I think at the time it must've been the largest web developer survey that was ever done, and it was done two years in a row, 2019 and 2020. What we saw from that was a pretty clear sort of cluster of issues that we identified as top pain points that had to do with browser compatibility, or supporting different browsers, or avoiding a certain thing because it doesn't work in a browser, or something about layouts as well that had to do with this. Among the top 10 issues in the sort of main pain point question of those surveys, I think four or so were about this browser compatibility or interoperability problem. I noticed that, and this was sort of my thing even then. I wanted to learn more, so together with Mozilla we did this follow-up deep dive called the Browser Compatibility Report. That, again, was a survey. We asked sort of more detailed questions about, 'What parts of the web platform do you have trouble with?' We also did interviews with, I think, 10 or 15 people. Really long surveys. Sorry, really long interviews. We wrote all that up in a report, and sort of published our findings. Big picture, what you could say is a lot of the pain points had to do with CSS or layouts. That's where I focused a lot right after that, because that's what the data seemed to be pointing at. Not exclusively layout, but it sure was a sort of cluster of issues there. Flexbox and Grid were like the two main issues in that subcategory.
Philip Jägenstedt: But why do you reckon Grid and Flexbox? Why aren't people complaining about table layout and floats? Does it have something to do with the age of those features, or?
Brian Kardell: You bring up an interesting point with floats and table layout, which makes me want to say, 'Kids today complaining about interoperability.' Back in the day, back in my day, which maybe is an interesting point to turn and say, do you know who started the tests for CSS in the first place?
Philip Jägenstedt: Well, I haven't looked at the first commits. Who was it?
Brian Kardell: Well, he is the person who hasn't spoken much on this call.
Eric Meyer: And not talking in WPT. Brian's referring to the original CSS1 Test Suite, which I created in the mid '90s, mostly for my own edification, but then I shared it with the world. I shared it with @svgeesus, who we mentioned earlier, Chris Lilley, who was chair of the working group at the time, and said, 'Hey. Is this working group at all interested in having something like this, that I put together to try to figure out how CSS works?' He said, 'Would you mind if I shared this with browser vendors?' I was like, 'No, I wouldn't mind at all.' Yeah, that's what became the official CSS1 Test Suite. It was about 85% the tests that I wrote, and then we added things like Acid1, the first acid test, that Todd Fahrner put together. We got his permission to include that in the CSS1 Test Suite.
Brian Kardell: Right.
Philip Jägenstedt: So, as I said, a few features came out sort of on top. We had Flexbox. We had Grid, but we also had a cluster of issues that had to do with scrolling that were interesting. We took what we learned from that, and we tried to just put together a list of features that we could prioritize together and improve on. Google worked with Microsoft on that and came up with a list of five things, among them Flexbox and Grid, that we thought it would make sense to prioritize. Yeah. Then we announced that and called it Compat 2021. Compat, after the browser compat survey. We might as well have called it Interop, and off we went. It was really off the success of Compat 2021 that I think everyone came to the table for Interop 2022, and now Interop 2023. This model of focusing on a small number of things, smallish, really made a difference. Focus, prioritization. For years we had tried to treat web platform tests as a sort of monolith, a whole, and to have metrics for test failures and triage all failing tests, that sort of thing. But it was very difficult, and I think I've heard this was the experience at Mozilla as well: when you try to approach it as a whole, it's just this big bucket of mess, and nobody really knows what to do with it. Boiling it down to a smaller number of things that we knew were important for other reasons, not just because they're failing, but because developers are struggling with those things, that really was key, I think, and that's the model that we've continued to follow.
Brian Kardell: Yeah. We also were involved in Compat 2021, and we thought it was a great effort. We liked that it expanded into 2022, and it got some more people involved. With 2022, I also wrote a Web Almanac chapter about that.
Philip Jägenstedt: Right.
Brian Kardell: You can go read. I think Eric and Phillip were both reviewers for that chapter.
Philip Jägenstedt: Mm-hmm.
Brian Kardell: I don't know if we'll continue that. I kind of hope we do. I think it's an interesting idea that, as you say, it's difficult for browser vendors, because the needs, the asks of the platform, are everything. The amount of things being actively worked on is still almost everything. That's being worked on by someone, but then you have to convince everybody else to work on it until it gets done. How you prioritize what to work on is very difficult: how do you prioritize what things to assign engineering time to, and spec people, to make sure that we have all of the details that we need for good interoperability? I think that's where Interop has been really successful. Another thing that I really like about what we've done with Interop is that some of the things that we include are brand new.
Philip Jägenstedt: Yeah, that's right. That wasn't a given that we should do that. Let's see. I don't think Compat 2021 had things that were totally unimplemented, but I'm not sure. Interop 2022 certainly did.
Brian Kardell: Aspect ratio I think was part of 2021. It was brand new.
Philip Jägenstedt: Yes. Yes, you're right. It was.
Brian Kardell: It's the only one that pops into my mind.
Philip Jägenstedt: I think if we have clear web developer demand for a new thing that solves a problem, and we can all agree that it's important, then it's great that we can do that. Usually when we talk about interoperability and testing, we're talking about the things that are implemented and the messy corner cases. I do think there's really value in prioritizing new things at the same time, and focusing really on the quality of those things as well. Get it right to begin with. Then maybe we don't have five years, or 10 years, of low-level pain before sort of all the rough edges are filed off.
Brian Kardell: Yeah. I just think there's such an outsized impact for these things that we do that are front-loaded like that. I'm not even sure you can compare them, because I think it's in part related to the thing that, when you asked, 'Why do we have so much feedback about Grid and Flexbox, and not so much about things like tables and floats?' It's because those are promises that we haven't kept in a long time, tables and floats. Right? Whatever problems remain there, people have had to move past that. They've had to find ways to work around it. It's not at the front of their attention anymore. Anyway, they want Grid and Flexbox for most of those challenges anyway. I hate to use this term, but it's like this is the new hotness. This is the thing that is meant to currently alleviate my pain. I like the way it's supposed to alleviate my pain. It's getting a lot of scrutiny, but I think that there's just this huge, huge difference between this idea of the new things and the old things. I think that the other part of this outsized difference is that I don't want to say you only get a chance to make a first impression once, but regular developers, they have finite time too. Right? When the default mode is everybody does their own prioritization of all these things, it means that there's no answer to, 'When does the new thing arrive?' It's like, 'We don't know.' We get our first implementation. We get a bunch of talks. We get a bunch of blogs. We get maybe some podcasts, and a whole bunch of people go, 'Wow. Cool. I can't wait to use that. Let me pull it up,' and they're like, 'Oh, it works in only one browser.' You know, that can carry on, in some cases, historically, for five, six, 10 years between the first implementation and the last one. There's real challenges with that, because implementation scrutiny often helps shape the spec. By the time you get to the third implementation, sometimes you have to debate about, 'Well, what are we going to do now? Because this doesn't make sense the way that it's implemented in the first one.' Yeah. I mean, I think that in every aspect of this there's just a really, really outsized impact to being able to communicate a thing, and have a degree of trust, and have a good feedback loop. I can't say enough about how much I think we probably gain from front-loading prioritization.
Philip Jägenstedt: Yeah. I think that's exactly right. When there's excitement for a feature, and then you realize you can't actually use it yet, then I guess you forget about it. I think the web platform is more capable than developers think it is, because they assume things are worse than they are based on experience. I think that's probably happening. Yeah. What we have is a coordination problem. We're all implementing a bunch of features, but at different times, and announcing them at our own pace, so it's totally uncoordinated. I think if we coordinated a little bit, then it's actually just a win-win for everyone. It's less work for everyone, and the result is better. Yeah. I view it as a coordination problem. Even if sort of Interop shouldn't exist in a sense, then I think coordination should exist because, for developers, the web platform is not just one browser. It's whatever browsers they have in their support matrix. Until it's there solidly, then the feature doesn't exist, so coordination.
Brian Kardell: To really add some clarity to the thing that I was saying about Interop shouldn't exist, Interop should exist. There shouldn't be a separate project called Interop. In an ideal world, this is what standards would do.
Eric Meyer: Right.
Philip Jägenstedt: Standards and tests, right? Because I think part of just what-
Eric Meyer: Part of standards?
Philip Jägenstedt: Well, maybe in an ideal world.
Brian Kardell: Yeah, right.
Philip Jägenstedt: There's an interesting sort of other universe, which is ECMAScript, which doesn't have an Interop program, and maybe doesn't need one. I'm not super into ECMAScript and TC39, but from a distance it looks like things work quite differently there. The test suite looks pretty solid. Browsers generally just try to pass all the tests, and then say the feature is done, so there's none of this sort of everyone passes 80% of the tests, and it's not the same 80%. If the tests are solid, and everyone tries to pass them all, then there you go. You have interop without all this coordination. There is a different model out there, and I'm always interested in sort of what we can learn from that. Is there a downside to it? Does it move slower because of it? I'm not totally sure about that, but, yeah, it sure is different.
Brian Kardell: Test262, the ECMAScript test suite, was also similarly added late in the game. I think 2010 is when that came about. I think it was first used for ES6. Yeah, it's interesting. I think they're all different. I mean, I know we have people here who do work in other standards bodies, like Khronos, for example, with the Vulkan conformance test suite, just as an example. Yeah, it's interesting. There's so many models to this. I really do think that would be an interesting show all by itself, to just talk about different models and everything.
Philip Jägenstedt: Yeah, and about what the place of testing in the standards process is, because that's also radically different between the WHATWG, the W3C working groups, and TC39.
Brian Kardell: Yeah, and what you call a standard. Right? This is maybe an interesting thing to broach into how, also, this is very difficult to discuss in a way that is really productive. Right? Because that's what we care most about. I mean, at least us. That's what we care most about, just being really as productive as we can. There are lots of potentially interesting ways to look at this information. Right? You could say, 'Well, what only fails a test in a single browser?' That's an interesting thing. Right?
Philip Jägenstedt: Mm-hmm.
Brian Kardell: That would be an interesting thing to look at. It could be a way to help organize your priorities.
Philip Jägenstedt: Yeah. We've tried that one.
Brian Kardell: There are actually things that are in the HTML standard that have tests. They're implemented in two browsers, and the one other browser has been vocal since the beginning that they think that that's just wrong. You know? They're not going to add it. I don't know. Is it standard or isn't it? Right? That's tricky. I think that the most useful definition of a standard is where the rubber hits the road. Right? When all three of them agree, and we have interoperability, that's a really good standard.
Philip Jägenstedt: Well, even the W3C doesn't have that part. In a way, the WHATWG has a higher bar, and in a way the W3C has a higher bar. They're not exactly comparable. The WHATWG sort of working mode is if two implementers want to work on something, then it's worth writing down so that it can be interoperable. If a third implementation doesn't care so much about the thing, it can still go into the spec. The W3C, I guess we're talking about the exit criteria for, I don't know, REC or something, and having two implementations pass tests. There, also, it's not saying all implementations need to support the feature for it to be a recommendation. Right?
Brian Kardell: Yeah. I mean, even, say, Test262 has a similarly debatable thing. What does it mean that you need to have so many implementations? For a lot of us, what we mean is the browser implementations, but that's not necessarily the case. Conceivably, at least. I'm not sure if it's still this way, but it was a few years ago: you could have zero browser implementations, but you have implementations in some other engines, so that counts. In practice, that doesn't seem to be a big problem, but it does create situations like this where it's a little bit tricky to define.
Philip Jägenstedt: Yeah. I mean, this is something that I think probably sort of should be left not to standards organizations, but to places like MDN to say, 'Hey, developers. These are things that you can rely on.'
Brian Kardell: Yeah.
Philip Jägenstedt: I mean, who cares what color the spec is, or if it says it's good or not? Ultimately, what-
Brian Kardell: Yeah, that's my position.
Philip Jägenstedt: Yeah.
Brian Kardell: Yeah.
Philip Jägenstedt: Good. Yeah, but that doesn't quite exist. If you could go to a feature and see, yes, I can depend on this and call it standard, or call it what you will, that's what matters at the end of the day. Not the spec status, say, or the test results.
Brian Kardell: I hold up Web Speech a lot, because it's not a standards-track thing, and it's implemented in all three engines.
Philip Jägenstedt: Yep.
Brian Kardell: All the rest is great. It's all great ways to get agreement, and get prioritization, and interoperability, but they're all part of the process. Right? The ultimate thing that, as a developer, you want is to know that it works everywhere. Right? I think that's part of the reason why Interop is great: because we want everybody to have that feeling that, 'Oh, that's good. That works everywhere.'
Philip Jägenstedt: Yeah, exactly. I hope that that's something web developers get excited about. They can depend on new features actually arriving, and then being solid from the get-go. Take a feature like the :has() pseudo-class. You know? They see it's in this project, and soon enough they're going to be able to rely on it. Next to that, you have fixes to existing features like Grid, Flexbox, even border-image. This ancient technology still needs some polish.
Brian Kardell: I think, and I could be wrong. This is just my own personal experience and the experience of people that I have talked to, but it does feel a lot of times like we get this news cycle. This thing is coming. This thing is coming. This thing is coming. Then I don't know. Maybe it takes kind of a long time. I know you just used the example of :has(). Do you know when the first time was that that concept entered the CSS specifications?
Philip Jägenstedt: I don't know. As :has(), I suppose it can't be more than a few years, but-
Brian Kardell: No, no. It has been a long time. It was originally introduced by Daniel Glazman in 1998. We were already starting the CSS3 specifications. Back then it was just going to be CSS3. It was in there as the subject selector, but that led to lots of conversation. Very, very quickly it became :has(). A lot of people know that jQuery had :has(). The reason that jQuery had :has() is because when John Resig was making that work with CSS selectors, it was in specifications already for a long time. Like John, I was positive that any moment :has() was going to be supported by my browser, that someday that was coming because it's in the specs. Right? Then it got punted to Selectors Level 4. For many, many years we've been working on Selectors Level 4. A couple of times, we've discussed punting it to Selectors Level 5.
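For readers who haven't used it, the :has() pseudo-class being discussed lets a selector match an element based on what that element contains, which is the opposite direction from how CSS selectors traditionally matched. A minimal sketch (the class names here are illustrative, not from the episode):

```css
/* Style a card only when it actually contains an image.
   Before :has(), this kind of parent matching wasn't possible in pure CSS. */
.card:has(img) {
  border: 2px solid steelblue;
}

/* Style a label whose adjacent input is currently invalid. */
label:has(+ input:invalid) {
  color: crimson;
}
```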
Philip Jägenstedt: If I'm not mistaken, the thing that ultimately unblocked that had something to do with work that Igalia did proving that it could be done performantly. Is that right?
Brian Kardell: It absolutely is true. I wrote this post in 2019 called Beyond Browser Vendors, and what it lays out is this concept that, effectively, there are just way too many problems to work on, and not enough people. We do work in this completely uncoordinated fashion generally. Right? That is ...
Philip Jägenstedt: Yeah. That's our default.
Brian Kardell: There is nothing that says your web engine team must contain this many people, and they must all have the same throughput capability, and all of the same diversities of specializations. Nobody is allowed to get sick or have parental leave. Right? I mean, everybody is managing their own queue, and they all have very, very different abilities and budgets. As you say, it is just a giant, giant queue that you have to work. That means that sometimes, unfortunately far too much, ideas don't advance because we're pretty sure we can't make them happen. We're pretty sure that that's like a burn of valuable time. If there are things that have been discussed a lot that vendors say we can't imagine how that could possibly work, then if you bring it up again, vendors are not super keen to want to burn their resources on that. You know? So if you have something questionable like :has() that really works in sort of the opposite way that all of the other CSS optimizations have developed over the years, you have to basically collect a lot of data. You have to build a prototype. You have to do a lot of research and really show that it can be done, and unblock the sort of nuclear standoff that happens on just discussing whether you're willing to do it.
Eric Meyer: Does it seem that way from your perspective, Philip?
Philip Jägenstedt: Yeah. I'm interested in these features that have seemed impossible for a long time, and then they happened. I haven't witnessed any of those up close myself, but I think container queries is like that, and :has() is like that.
Brian Kardell: We also had something to do with that.
Philip Jägenstedt: Yeah?
Eric Meyer: Yeah, :has(), container queries. Grid, for that matter, seemed impossible forever.
Philip Jägenstedt: We didn't discuss this before when we talked about standardization, but I think the role of prototyping is really important. Just proving that something is possible really can advance the discussion from 'we think this wouldn't be performant' to having a prototype that works fine. I think even the W3C can seem kind of waterfall-like, where you first write the spec, and then you go and implement, but I think a lot of the time, when it's working well, prototyping has happened sort of in the background and got things rolling. There is a sort of momentum effect that I keep noticing. Once something gets rolling, there's a sort of momentum, and you sort of want to jump on that, because if it stops moving then it's going to be maybe even harder to get it moving again. If there is interest, certainly I would. Let's say I was working on fullscreen, and another vendor comes along and starts filing issues. I would sort of jump on that and say, 'Okay. It looks like they're working on this right now, so this would be a good time for me to prioritize that as well,' because, well, collaboration works. Yeah. That builds a kind of momentum for sure.
Brian Kardell: Yeah. I think that that is largely what Interop is. Right? I mean, that is all of us going, 'Okay. Collaboration really works.' We can't afford to collaborate on all the things, but we can definitely afford to coordinate on a lot of things. Let's talk about what those things should be this year.
Eric Meyer: Yeah. I also, as a developer, not as a participant in Interop, but as someone trying to look at it from the outside, what I really like about it is it tells me this is what browser makers not only agree on, but think is achievable. Right? Maybe not everyone quite perceives it that way, but just that idea of, okay, so these are new things that are going to be worked on across the board for the next year is the plan. Now you can look at it and say, 'Oh, everyone's working on container queries, or cascade layers,' or whatever it is. Sometimes it can be, 'Oh, yeah. Jeez. This is a technology that has long been a pain point because it doesn't work right consistently across browsers,' and the browser makers have recognized that that's the case, and they're working on fixing it, or, as with Interop 2022 and 2023, 'Wow, everyone's pulling together to implement new color formats so that I can use OKLab. Maybe in a year's time or less.' I think actually it's a sign of a little bit more maturity in the space. Right? Rather than every browser team running off and doing their thing, where sometimes what they do overlaps, and sometimes it doesn't, there's actually a sense of a community. Right? A community of browser makers, or engine makers, if you want to put it that way, saying, 'These are the things that we all have agreed to work on. Between us, we have agreed to work on these things in the coming year.'
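The new color formats Eric mentions here come from the CSS Color work that Interop 2022 and 2023 focused on. A small sketch of the syntax (values chosen arbitrarily for illustration):

```css
.accent {
  color: oklab(62% -0.1 -0.06);      /* OKLab: perceptual lightness, a, b axes */
  background: oklch(70% 0.12 250);   /* OKLCH: lightness, chroma, hue angle */
  border-color: color-mix(in oklch, rebeccapurple 40%, white);
}
```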
Philip Jägenstedt: Yeah, community. That's right. I do feel like we are coming together as a community, and it's a friendly bunch of people getting together and just trying to make things better.
Eric Meyer: Right.
Philip Jägenstedt: There's a clever hack here, actually, though, because we don't all want to promise we're going to ship this thing for sure. There's a level of indirection in putting a number on it, and a metric, and we're happy for these numbers to be out there and make us look bad if we don't work on it, but that's not exactly the same as promising. I think it's reasonable to assume that we announced this project, and those are the things we're going to do. Of course, we wouldn't announce it if that wasn't our intention. It's not the same as saying, 'Here's our road map. Here are the things that you can expect by such and such date.' That level of indirection I think helps.
Eric Meyer: So what's in 2023 this year?
Philip Jägenstedt: 26 things.
Brian Kardell: Can we talk about not just the things, but sort of how we got the things, and what's different about 2023?
Philip Jägenstedt: Yeah, let's do that. How far back do we go? Well, we got together and said, 'Let's do Interop 2023,' and started to write down a process. That process is one of public proposals for specific features. We sort of announced to the world, 'Hey. We're going to do another round of this Interop project. Please submit your proposals.' Of course, us browser vendors submitted a lot of proposals, but we also got a really good number of proposals from sort of outside the core group of companies. In total, we got 87 unique proposals to deal with. Through a long and laborious process, that was narrowed down to 26, which includes some that are carried over from 2022. I think the interesting part of the process is maybe how do we get from those 86 to the 26? Sorry, the 87 to the 26. What happened there?
Brian Kardell: I think it would be fair to say that not just in terms of a number, but in terms of the diversity of things in here, the amount we're kind of biting off, is more in 2023.
Eric Meyer: Oh, yeah.
Brian Kardell: By a significant margin.
Philip Jägenstedt: Yeah. It's absolutely a bigger chunk of work than in 2022, which I'm really happy about. It's a more diverse set of features. 2022 was quite CSS heavy. This year is as well, but we also have features like WebCodecs, pointer and mouse events, offscreen canvas, so just more variety.
Eric Meyer: Yeah, because the original Compat, wasn't it all CSS, or it was 80% CSS? Something like that?
Philip Jägenstedt: It was all CSS, yeah.
Eric Meyer: Right.
Philip Jägenstedt: It was aspect ratio, Flexbox, Grid, transforms, and sticky positioning.
Eric Meyer: Yeah. Okay, right. Yeah, so I think it's been a natural evolution to start bringing in more stuff. Now that the model is sort of ... I don't know if proven is the right word, but it's been demonstrated that this is going to be a regular thing, now it's not just CSS. There's CSS. There's some HTML. There's some ECMAScript. There's, like you said, WebCodecs. Things like that, which is also a good evolution. I really like that.
Philip Jägenstedt: Yeah. I mean, there was this process where all the participants, so Google, Microsoft, Mozilla, Bocoup, Igalia, Apple. Shoot. I forgot Apple. I'm sorry. Anyway, we got together in calls and said, 'Okay. What is everyone's position on each of these proposals? Are you strongly in favor?' Basically, people indicated, or teams indicated, whether they were strongly in favor, or if they were opposed for some reason. Opposition could be anything from, 'Well, somebody proposed this, but there's no specification yet, so we can't. We don't have anything to interoperate around,' or it could be, 'We don't have the resources to do everything, and we don't think we can get to this one this year,' or really any other reasons. Those were all discussed. Right?
Philip Jägenstedt: For each proposal individually, we're establishing whether we have consensus to include it. If we don't, it's not included. I think that's what I've talked about before: we're sort of filtering to the things we have consensus on. It's importance plus consensus. I think it has to be. It's very intentional, and it worked like this for 2022 as well: we can't have a process which tries to force someone to prioritize something they don't want to prioritize, because I just don't see where that power would come from. Right? We're different companies. Why would we do things we don't want to do? It's sort of built into the process that we only do things that we can all agree to, and we have almost a no-questions-asked policy. If someone says no and doesn't want to elaborate, that's okay, because what else could we do? We can't force each other to reveal secrets or do things that we don't want to do.
Eric Meyer: Right.
Brian Kardell: Yeah.
Eric Meyer: Very true.
Philip Jägenstedt: The basic step here was, for each proposal individually, do we have consensus? Then we were left with quite a lot of proposals, more than we finally included, because we started sort of merging proposals into bigger buckets. We were left with a bunch of proposals, and then we had this process of merging, and renaming, and coming up with things that would make sense and be explainable, until we finally ended up with these 26.
Brian Kardell: In the past years, I don't think that we had a process the same way, but we did try to ask people. Right? At least I did. What do you think would be valuable to include in here?
Philip Jägenstedt: Yeah. It was, in principle, an open process for '22 as well, but we didn't announce it as broadly, and maybe we didn't even get any proposals from non-browser vendors or non-participants.
Brian Kardell: I don't think we did. That's where I was going with this. I know I said this in our meeting as well. If we had coordinated something with the media outlets, who are all very interested any time all of the vendors come together, and if we had made an even wider call and left it open even longer, I think we'd have had 400 instead of 85, because everything is important to someone. Right? That is the challenge, then: how do you prioritize it? It's very difficult.
Philip Jägenstedt: Yeah. I don't know if we could deal with ten times as many proposals as we had this time.
Brian Kardell: I think we could not, honestly. I'm just being honest. Yeah.
Philip Jägenstedt: Yeah, and it's not just the number of proposals, I think. I talked before about the value of trimming it down, and not having this huge bucket of everything. 26 is a big number. If we double that, it doesn't really fit on a screen anymore. It becomes hard to reason about, so I don't think we are going to keep doubling the size of the Interop program every year until we cover everything. That doesn't quite work together with focus, because for something to be important, there has to be something left out.
Brian Kardell: Yeah, exactly. Exactly. I think that there's a lot of value in having a relatively small forcing function number. Right? A relatively small number as a forcing function. Somewhere around 25 is probably, I think, hitting up near the max. It might even be too high already. We'll see.
Philip Jägenstedt: Yeah. Yeah. We've talked loosely about just limiting the number of focus areas to 26, or maybe 25, so that we have to finish things before we include new things. If we do that, we'd have to have a slightly different process than what we've had in the previous years.
Brian Kardell: Can I talk about a thing that we added this year that I think is the best thing, and I have real regrets that we didn't add it in the past? I don't know who actually did the addition this year, but we have this Interop chart at the bottom of the thing where you can look at the progress over the course of the year. You know?
Philip Jägenstedt: Mm-hmm.
Brian Kardell: It's these lines that go across. We have one for Chrome/Edge, or Edgium, as I like to call it.
Eric Meyer: Me too.
Brian Kardell: Firefox and Safari. I guess really it's Gecko and WebKit. Then we have a fourth line that's called the Interop line, and that's the number that passes in all three. Going back to that, one interesting view that you could take is what is implemented in two of three browsers. Right? That's one way you can look at it. We also talked about, well, one way you can look at what the standard is, is what's implemented in all three, regardless of what's in the spec and where. Right? What has really high interoperability? This way of looking at it is actually much, much better, because when we see it on the chart, we see that, for example, if you look across all focus areas we have 65 to 80% in all the browsers at the beginning, but the Interop line is only 50%. That means only 50% of things pass in all three browsers, and that's the one that matters at the end of the day. Right?
Philip Jägenstedt: Of course, that's because in the sort of Venn diagram of passing or failing tests, we're not all passing or failing exactly the same tests. That Interop number can never be larger than any individual browser's score.
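The relationship Philip describes, where the Interop line can never exceed any single browser's line, can be sketched with toy data. (The engine names and test IDs below are made up for illustration; this is not how wpt.fyi actually computes its scores.)

```python
# Toy illustration: a test only counts toward the "Interop" score if it
# passes in ALL engines, so the Interop score is bounded above by every
# individual engine's score.

passing = {
    "chromium": {"t1", "t2", "t3", "t4"},
    "gecko":    {"t1", "t2", "t4", "t5"},
    "webkit":   {"t1", "t3", "t4", "t5"},
}
total_tests = 5

# Per-engine score: fraction of all tests that engine passes.
browser_scores = {engine: len(p) / total_tests for engine, p in passing.items()}

# Interop score: fraction of tests passing in the intersection of all engines.
interop = set.intersection(*passing.values())
interop_score = len(interop) / total_tests

print(browser_scores)  # every engine individually passes 4 of 5 tests
print(interop_score)   # but only the tests passing everywhere count here
```

Here each engine scores 80% on its own, yet only two tests pass everywhere, so the Interop line sits at 40%, which mirrors the "65 to 80% per browser, but 50% Interop" gap mentioned above.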
Brian Kardell: Yeah.
Philip Jägenstedt: Yeah. I think it'll be exciting to see that number go up over the year. In past years, getting past 90 has been a kind of milestone. I hope that'll happen both for all browsers individually, and for this Interop score. That will be really great to see.
Brian Kardell: I don't know if this makes me seem dull or something, but I always have looked at those charts as if probably the lowest number on that chart was the degree of Interop. Right? But this additional line makes it very clear that, well, no. Obviously that's not the case. Once you understand that, that is a really different kind of understanding, and I think a more valuable one.
Philip Jägenstedt: So does that just make you even more pessimistic? It's even worse than you thought?
Brian Kardell: I mean, I think it's just more honest. I mean, I don't have an optimistic or pessimistic view of this. Right? I think if I want to understand a single metric, that one is the one that seems to be the one I should care about. You know? Like-
Philip Jägenstedt: Right. If we want to put a single number on how well we're doing on these 26 focus areas, then yeah. I agree. That's the one.
Brian Kardell: Even in stable, we're already up to 70% in all active focus areas. You know, it's only March.
Philip Jägenstedt: Yeah. I get the sense that we're going to make more progress faster this year. I think we'll get to 90 before ... mid-year, say. I think we're going to end up closer to 100, for the most part, than for Interop 2022, and it's because there's a lot more, I think, excitement around it, and it just feels bigger and more important this time around.
Brian Kardell: Good.
Philip Jägenstedt: Yeah. I'm really optimistic about that number.
Brian Kardell: Is there anything else that you want to make sure that we get covered?
Philip Jägenstedt: Well, maybe I just would like to hammer the point about developer pain points, or developer needs, again. The point of all of this is that the web platform isn't always a great place for developers. I mean, yes, it's better than in the bad old days of spacer GIFs and what not, but it's still plenty frustrating. Differences between browsers look like the top pain point for developers. Sometimes I jokingly say that my job is to reduce the number of web developer tears per second, and I think this is the main way that I'm trying to do that.
Brian Kardell: Do you file a TPS report at work?
Philip Jägenstedt: No, only jokingly. Ideally, I think what this process, or this program, would be is we get sort of input from developers about what matters most, where the most frustration is, and we have a shared understanding of those pain points. Just like we did with the MDN DNA surveys, and the compat report in the State of CSS survey. Now, none of these are perfect individually, or even put together, but the more we have of this information out there in public, and when we have a shared understanding of where the biggest problems are, then we can prioritize together the things that we think will matter the most. Ideally, this turns into a little sort of engine of improving developer happiness, or success, or what have you. That's what this all is anchored to for me. I also love the web standards and the details of testing. I can get deep into that, but it's for developers to cry less at night in the end.
Eric Meyer: Yeah. I think that's a good place to end it. Thank you so much, Philip, for ... sharing your time and your insights with us.
Philip Jägenstedt: Well, thank you. It's been a pleasure. Let's keep interoping.