Vitalik Interview (Audio)

sandy

00:00-51:52

Transcription

The speaker discusses their breakfast routine, mentioning that they eat 90% dark chocolate. They then talk about their experiences with experimental community events, highlighting Zuzalu and ETHDenver as particularly interesting. They also mention their involvement with Zuzalu, an Ethereum-connected community, and their exploration of AI tools. Moving on to Ethereum-related questions, they explain the purpose of the Dencun upgrade, which aims to increase scalability and reduce transaction fees for Layer 2s. They mention the decrease in Layer 2 fees since the upgrade and anticipate further scalability gains in the future. The speaker envisions Ethereum's impact on society in the next five years, emphasizing the importance of user experience and low fees for the success of stablecoins. They also discuss the potential for non-financial applications, such as alternative social media platforms and Ethereum-based identity systems, which they hope will provide decentralized solutions.

...into a bowl, and then that's my breakfast. So it's sweet? It's not sweet. It's not sweet? Chocolate? No, 90% chocolate. Wow. When I say chocolate, I always mean dark chocolate. I don't mean sugar with a little bit of chocolate. It's sweetened. Exactly. Okay. Okay. Interesting. Next one. Also about travel. When you travel around, what experimental community event inspired you the most? This is from Method Town. Yeah. I guess it depends on what you mean by an experimental community event. Right? I mean, obviously, you know, Zuzalu last year was probably the biggest thing in that category, and there was definitely a lot of interesting groups and events and activities inside of there that I found really fascinating, both in terms of actually using Ethereum and actually using zero-knowledge proofs and those things, but also experimenting on the health side, experimenting on the community side. 
In terms of the other big thing, which is different Ethereum events that I go to, I mean, I definitely respect ETHDenver for just being interesting and cool, and in a bunch of ways, they had me wear a buffalo suit. I mean, I do know that it's gotten a lot bigger in the last couple of years, and I haven't been to the last couple of years, but the ones that I've been to were definitely fun. The one in Vietnam that I was just at was fun. I think this was the first one where the entire venue was outdoors, and they made it really fun, and they found a venue that's this nice thing that's kind of similar to the West Lake in Hangzhou. That's the closest reference I have. It was last week, right? Yeah. It was last week. Definitely by far the most COVID-safe conference I've ever been to. Yeah, that was cool, but yeah, I definitely want to see more experimentation and different event styles, and not just bouncing around between Hiltons and various spin-offs that are actually owned by Hilton every other week. Okay. That's my LFAS. Thank you. Okay. Next up, what are you most excited to work on these days? It doesn't have to be Ethereum. Oh. Okay, well, since I'll answer about Ethereum in all the other questions, I guess I'll focus on the non-Ethereum stuff that I've been doing. What am I doing that's non-Ethereum related? Well, again, Zuzalu is continuing this year, and it's Ethereum-connected, but the community obviously goes beyond that, and there's a bunch of independent groups that are making their own various villages this year. So I'm following along, helping them when I can, excited to actually get a chance to visit some of them soon. I've been playing around with some of the recent AI tools and running some of the models locally and using them to draw and do other things. So that's a really important space to just get more understanding in. What other things? Again, just learning. I think there's a lot in learning a bit of everything. 
Those are the big ones. Next, we're going to have some AI-related questions. Oh, by the way, this person is from Coinsider. All right, so let's go back to Ethereum-related questions. A lot of media want to ask, including Rock Radio, Miraculous Post, Foursquad News... Look at Rock Radio, Grenade, wow, these are pretty violent names. Yeah, that's what I thought. The blockchain group media. All right, so the question is about Dencun. So how does the Dencun upgrade contribute to the Ethereum ecosystem? Does it meet your expectations? Yeah, so the purpose of the Dencun upgrade is to massively increase the scalability and reduce the transaction fees that are paid by Layer 2s, particularly rollups. And the way that it does this is by creating an independent section of data space inside of each block. These are what are called blobs. And this data cannot be accessed by the EVM, which is important, because what that also means is that when a client is verifying an Ethereum block, they don't have to have access to that data at exactly the same time. But the data is data that the Ethereum blockchain guarantees is available. And so this is very useful for rollups and, in principle, any Layer 2 project that depends on data being available for its security. So that, for example, to ensure that other nodes can sync if the current nodes disappear, or to ensure that if someone sees a wrong answer, they can challenge it and things like this. And we have already seen, over the past week, Layer 2 fees decrease massively, in some cases by a factor of like 50. Now, I think it is important to warn that there is a possibility that these fees will increase again as there come to be more users of blobs, because as there come to be more users, eventually the fees will go up. But it is still a pretty significant scalability gain. And we expect the number of blobs that the Ethereum chain supports to continue to increase greatly over the next few years. Is it better than you expected? 
I think it depends what you mean by better. So from a technological perspective, the upgrade went by flawlessly. I think the participation rate of attesters only went down from 99% to 95%, which is better than any other forks that we've had. The amount of usage is interestingly low. Right now, the target is about 3 blobs per block, but the average amount used is only 1 blob per block, which means that right now, blobs are incredibly cheap. If you want to publish a blob, you basically only have to pay Ethereum transaction fees. It's possible that Ethereum transaction fees being high is one of the reasons for this, but if the price of blobs tends to zero, then you can just go use them to back up an encrypted copy of your hard drive or whatever. There's an infinite number of uses for guaranteed data space. So I do think it will increase eventually. It is definitely good for present-day rollups that it's very cheap, but I'm definitely looking forward to usage going up over the next few months. It's also good if the usage goes up. Exactly. So what do you envision as the single most transformative impact Ethereum will have on society in the next five years? I think the next five years are going to be critical for Ethereum because a lot of the applications that have been fairly theoretical and fairly small-scale are actually starting to get to the point where they're ready for real-world use. And I think in terms of the blockchain space really affecting the real world, one big impact that it already has had is just that ideas generated by the space have really filtered into the broader world in a lot of ways that have not been appreciated. So for example, Reddit is doing their IPO soon, and one of the things that they've done is they've given people who have been active in the community, like very active contributors, moderators, the ability to participate on the same terms as institutional investors. 
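The blob pricing dynamic described above (fees pinned to the floor while usage stays below the target) follows from the EIP-4844 fee rule. A minimal sketch, using the constants and integer-exponential approximation from the EIP; this is an illustration of the mechanism, not client code:

```python
# Toy model of the EIP-4844 blob fee rule (constants from the EIP).
GAS_PER_BLOB = 131072
TARGET_BLOB_GAS_PER_BLOCK = 3 * GAS_PER_BLOB   # the ~3-blobs-per-block target
MIN_BASE_FEE_PER_BLOB_GAS = 1                  # 1 wei floor
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    """Integer approximation of factor * e^(numerator/denominator), as in the EIP."""
    i, output = 1, 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = numerator_accum * numerator // (denominator * i)
        i += 1
    return output // denominator

def next_excess(excess_blob_gas: int, blob_gas_used: int) -> int:
    """Excess blob gas carried into the next block."""
    return max(0, excess_blob_gas + blob_gas_used - TARGET_BLOB_GAS_PER_BLOCK)

def blob_base_fee(excess_blob_gas: int) -> int:
    return fake_exponential(MIN_BASE_FEE_PER_BLOB_GAS,
                            excess_blob_gas,
                            BLOB_BASE_FEE_UPDATE_FRACTION)

# With average usage of 1 blob per block (below the 3-blob target),
# excess blob gas never accumulates, so the fee pins to the 1-wei floor.
excess = 0
for _ in range(100):
    excess = next_excess(excess, 1 * GAS_PER_BLOB)
print(blob_base_fee(excess))  # 1
```

If usage instead sat at the 6-blob maximum for a sustained stretch, the excess would grow by three blobs' worth of gas every block and the fee would rise exponentially, which is the "fees will go up as there come to be more users" point in the answer.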
And I think that kind of approach is definitely something where crypto applications have really helped blaze the trail and legitimize and set the standard. But aside from that, basically the biggest impact, like actual use, has been I think stablecoins, right? And people just using stablecoins to save money, to trade, to conduct transactions. And I think what will happen over the next five years, right, is that first of all, in order to succeed in that space, you need to have very good UX and you need to have very low fees. And historically, Ethereum has not had those things, but over the next five years, Ethereum will start to have those things. And we're already seeing Layer 2s and things like Base already starting to get there. But I think that will continue improving. And once that gets there, then I expect Ethereum to be a leading player in helping to make stablecoins accessible to people in a way that is actually open, actually decentralized and that actually doesn't require trusting fragile third parties. But going beyond that, I expect non-financial applications to really start to make a much larger impact. Right. So, you know, we've been seeing for the past year the success of Farcaster especially and also to some degree Lens and a few others in creating alternatives to social media platforms. And, you know, there's a big desire to have alternatives to things like Twitter and Facebook at the moment. And, you know, these projects, I think, are doing a very good job of leading. And I expect for that to continue. And I expect that the particular benefit that decentralization gives to these projects, which is basically that anyone can write a new client. And if you have a new client, you can access and you can write to the same content and you don't have to build your network effect from scratch. I expect that benefit to become much more clearly visible to a lot of people. 
And, you know, we'll actually see a very vibrant ecosystem of people creating different clients, of people creating different ways to see the same content. So I think that space is going to be interesting. The Ethereum-based identity space, I expect, is growing quickly. The technology is improving quickly. And I am really hoping it will see some mainstream usage. So this includes, you know, different proof-of-personhood protocols, whether that's Proof of Humanity or the Worldcoin system or Idena or any of those. It includes social graph-based things like HOPE and ZooStamps. It includes things like Gitcoin Passport. Basically, one of the big challenges that a lot of people are worrying about right now is how to prove that an account on some platform actually is a person, as opposed to being a bot or being one of a million bots or accounts controlled by the same person. And the big risk I see is that when people need to solve that problem, people will jump to centralized solutions. And centralized solutions, I think, are going to have very bad risks in terms of excluding all kinds of people. And I hope that the Ethereum space can actually lead in coming up with some of those decentralized alternatives and making them really accessible. Yeah, so I think all of those things are important and I think are going to continue to grow. Cool, what a beautiful vision. So the next one is a challenge. Can you explain how you see the current challenges in Ethereum proof-of-stake and how single-slot finality and other updates could address this issue? Is it rainbow staking? Yeah, so I think the big challenges in proof-of-stake right now largely have to do with various centralization risks. So the big ones that people are concerned about, one is on the MEV side and the others are risks to do with the function of staking and being a validator and attesting by itself. 
Basically, the challenge on the MEV side is that we're seeing growing builder centralization and builder censorship, and relays are emerging as this other kind of centralized actor. And there's a set of techniques, one of them is execution tickets, which is basically the latest form of what we call PBS, Proposer-Builder Separation, except it's probably more accurate to call it Attester-Builder Separation. And then the other one is inclusion lists. And I think the way to think about inclusion lists is basically to try to get back as close as possible to roughly the kind of world that Ethereum was in, let's say, 2016, when you would have decentralized validators be responsible for creating blocks. Except here, what we mean by a block is we just mean a list of transactions, like which transactions get included, because that's the most important thing from a fairness and censorship resistance perspective. And then actors like builders would be responsible particularly for some very specific set of functions that are important, but that do have centralization risks to them. So one is basically block prefix and block ordering, like the part of creating a block that could involve capturing MEV. Another is computing the state roots, like running the EVM computation, eventually doing a zk-SNARK of that computation, then doing aggregation tasks. So signature aggregation, ERC-4337 aggregation, potentially aggregating signatures and proofs from Layer 2s. So basically all of that centralized stuff would be done by builders that would be selected through an auction with execution tickets. And the parts of block production that are just more relevant to the protocol's permissionlessness, which is just choosing transactions, would be done by validators in a way that's less vulnerable to censorship. So that's one part. Another part has to do with staking itself, right? 
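The inclusion-list split described above (validators choose which transactions must appear, builders choose the ordering) can be sketched as a toy validity rule. Everything here is illustrative; real inclusion-list designs carry much more structure:

```python
# Toy of the split: the validator supplies an inclusion list (transactions
# that MUST appear), the builder supplies the ordering (where MEV lives).
def build_block(inclusion_list, builder_order):
    """An honest builder's block: its preferred ordering first, padded so
    that every inclusion-list transaction appears somewhere."""
    block = list(builder_order)
    for tx in inclusion_list:
        if tx not in block:
            block.append(tx)
    return block

def is_valid(block, inclusion_list):
    """The protocol rejects any block that omits an inclusion-list tx,
    so a censoring builder cannot produce a valid block."""
    return all(tx in block for tx in inclusion_list)

il = ["pay_rent", "sanctioned_tx"]
honest = build_block(il, ["mev_bundle"])
censoring = ["mev_bundle"]  # builder silently drops transactions
print(is_valid(honest, il))     # True
print(is_valid(censoring, il))  # False
```

The point of the design is visible even in the toy: the builder keeps full control of ordering (and therefore MEV capture), but loses the power to exclude.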
And I did a series of polls recently, both on Farcaster and in the ETHTaipei presentation that I did earlier, where I basically asked, if you are not staking, then why? And by far the biggest answer that people gave is that they have less than 32 ETH. And probably the second biggest answer is that running a node is too difficult. So for running a node being too difficult, we already have a pre-existing technical roadmap that is basically intended to solve this, right? So the next big step, for a long time, has been Verkle trees. And Verkle trees have actually made a lot of progress. Like there are active testnets with working implementations of it. And with Verkle trees, as a node, you would not have to store the state locally. And with EIP-4444 history expiry, you would also not have to store most of the history locally. And so the amount of data that you would need to be a node would decrease from multiple terabytes to basically being able to, in principle, run a node in RAM. And once we have that, then, for example, re-syncing and syncing will become much faster. Potentially it could be a few minutes. And it will just become a much simpler task that's much easier for people to do. And then with zk-SNARKs, the requirements are going to decrease further. Like I think in the long term, running a node will feel like downloading some data, hashing it, and verifying a SNARK. Just a few very simple computations that will be very easy to do as just a background process on any computer, maybe even a phone, even inside the browser. And there's a pre-existing technology roadmap to get to that point. The less-than-32-ETH part is more challenging. And basically because originally 32 ETH was this compromise between not requiring too much ETH to be a staker and not having too many stakers, because that would make blocks too hard to process. 
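The statelessness idea behind the Verkle-tree step above is that a node keeps only a short commitment (a root) and checks witnesses shipped with each block, instead of storing terabytes of state. A minimal sketch using a binary Merkle tree; Ethereum's actual plan uses Verkle/vector commitments, which give much smaller witnesses, but the verification pattern is the same:

```python
# Stateless verification sketch: the verifier holds only `root` and checks
# a branch for one leaf, never the full state. Assumes a power-of-two leaf
# count for simplicity; purely illustrative.
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves):
    level = [H(l) for l in leaves]
    while len(level) > 1:
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def branch(leaves, index):
    """Sibling hashes proving leaves[index] against the root (the witness)."""
    level = [H(l) for l in leaves]
    proof = []
    while len(level) > 1:
        proof.append(level[index ^ 1])
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify_branch(leaf, index, proof, root):
    """The stateless node's whole job: a handful of hashes, no state database."""
    node = H(leaf)
    for sibling in proof:
        node = H(node + sibling) if index % 2 == 0 else H(sibling + node)
        index //= 2
    return node == root

state = [b"acct0", b"acct1", b"acct2", b"acct3"]
root = merkle_root(state)
print(verify_branch(b"acct2", 2, branch(state, 2), root))  # True
```

This is why the answer says running a node could eventually feel like "downloading some data, hashing it, and verifying a SNARK": the heavy database is replaced by witnesses and a commitment.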
And there's some new approaches to proof-of-stake that basically involve sacrificing the requirement that every staker participates in every round of consensus. And if you sacrifice that requirement, then you're able to get the benefits of single-slot finality, so you have finality after one slot, and the benefits of, for example, being able to stake with less than 32 ETH and having simpler and lighter nodes, all at the same time. And there's a lot of challenges in the details of how you actually make that kind of algorithm work. And rainbow staking is one of those proposals. My write-up on 8192 signatures per slot is another of those proposals. So there's a lot of different ideas in play now, and this is a very active research area. Cool. Nice. And the next one looks a little messy, but maybe you already covered it. Yes, I already answered that one. Yeah, yeah, yeah. How about a modularized blockchain solution? Many projects are proposing this solution, and we are also seeing ideas that the Ethereum L1 could be responsible for shared sequencing. The broader question would be: what are the functions that an L1 should handle, essentially, and what should be left to individual L2s? Yeah, I think we're seeing some movements in opposite directions, right? So modularizing blockchains basically points toward a future where each individual chain does less and different components are done by different parts. And shared sequencing especially, as described by Justin Drake, one of our researchers, who is a big proponent of the Ethereum L1 doing shared sequencing. Basically, that's the vision where the L1 does more, right? So if you look at what the L1 is responsible for today, the L1 is responsible for shared security, and it's a shared settlement layer. Basically, with Ethereum, it ensures that every L2 has an ability to read any other L2 in a way that does not depend on any kind of centralized actors or even any validator sets, right? 
Because even if Ethereum gets 51% attacked, all of the L2s get reverted at the same time, right? So you even still have consistency. And then Ethereum provides data availability for rollups, but then it doesn't provide data availability for validiums. And then sequencing, so choosing the order of transactions, is currently left on a rollup-by-rollup basis, right? So the question is, what is an ideal approach in the future? And so on the shared sequencing question, I mean, I personally am agnostic. I know that some people are in favor of it, and some people think that shared sequencing is completely overrated. Basically, a lot of people, I think, would argue that, one, the benefits of shared liquidity are actually not that large beyond a certain size, right? For a regular user, it doesn't matter if market depth is $5 million or $10 million. You just need market depth to be big enough to handle your own transactions. And they would also argue that cross-L2 MEV is not actually that large, right? Basically, any MEV that is arbitrage between Optimism and Arbitrum could be decomposed into MEV between Optimism and Binance, and then separately MEV between Arbitrum and Binance, right? That's the basic way to think about this. And so to the extent that that's true, then there aren't actually large gains from some central all-seeing market actor having a global view into all the L2s and choosing transactions on that basis, and it could be fine for all the L2s to be sequencing separately. But this is, I think, an open debate, and I'm content to just keep watching the debate and see how it goes. For other kinds of functions, I think we definitely want to really expand the amount of data that Ethereum can support directly, right? And in an ideal world, everything would be a rollup, and Ethereum would handle data availability for everything. But we know in practice that's probably not enough. Even the long-term vision of 16 megabytes per slot is probably not enough. 
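The decomposition argument above is just arithmetic on price gaps. With made-up prices (all numbers invented for the example), the direct cross-L2 arbitrage profit equals the sum of the two legs routed through a common venue:

```python
# Toy prices for one asset on three venues (all values made up).
price = {"optimism": 99.0, "binance": 100.0, "arbitrum": 102.0}

# Direct cross-L2 arbitrage: buy on Optimism, sell on Arbitrum.
direct = price["arbitrum"] - price["optimism"]   # 3.0 per unit

# Decomposed: Optimism<->Binance leg plus Binance<->Arbitrum leg.
leg1 = price["binance"] - price["optimism"]      # 1.0
leg2 = price["arbitrum"] - price["binance"]      # 2.0

# The direct opportunity is exactly the sum of the two independent legs,
# so no single actor needs a global view across both L2s to capture it.
assert direct == leg1 + leg2
```

This is the sense in which cross-L2 MEV "is not actually that large" as a distinct category: whatever a shared sequencer could capture atomically can, in this simple model, be captured by two separate per-L2 arbitrageurs.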
And so the thing that I want to see is I want to see very high-security stuff being on-chain rollups, but then other things using various kinds of optimistic data constructions that basically do data off-chain by default, but data on-chain in specific cases where there might be some kind of problem. So Plasma is a good example of these constructions, but there's a whole spectrum of constructions that are like that. Another big one has to do with account abstraction. Basically, if your account has a state that might have to be changed, particularly if you have a key and you want to revoke the key and add a new key, or if you want to change algorithms, then where does that state live? And if you have an account in 100 places, then to update your account, do you have to send that same data 100 times? And there, one of the ideas here is basically this minimal keystore rollup approach, where that state lives in one particular place, possibly one minimal rollup on top of Ethereum, and then all of the other L2s would be able to call into it to prove with every transaction what the current state is that they're accessing. But these are still early-stage ideas, and they're still being discussed in the wallet space. So I think my view on this is very pragmatic, right? If there is a function that is important enough to handle at L1, then it absolutely should be. And otherwise, I think, leaving room for different L2s to do different things is also very valuable. Perfect. All right, that concludes the Ethereum-related questions. Okay, let's talk about Dogecoin. No way. Come on, we want an investment. No, no, no, no. Let me pass on that one. No, yeah, we'll let Elon talk about the dog. You're in front of a lot of media. It's okay. All right, so security and privacy. This question is from Dan Campbell. How does ZK address the trust issue for non-technical individuals? How do they know if ZK truly achieves authentication? 
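Before the answer, the keystore-rollup idea mentioned above can be made concrete with a hypothetical sketch (all names invented): key material for an account lives in one registry, every L2 checks against it, so a key rotation is a single update rather than one update per chain. Real designs would have L2s verify a proof of the keystore's state rather than read it directly:

```python
# Hypothetical keystore-rollup sketch: one shared registry of account keys.
class KeystoreRollup:
    def __init__(self):
        self.keys = {}  # account -> current public key

    def set_key(self, account, new_key, authorized_by=None):
        current = self.keys.get(account)
        # A real keystore verifies a signature; here we just compare keys.
        if current is not None and authorized_by != current:
            raise PermissionError("update not signed by current key")
        self.keys[account] = new_key

def l2_accepts(keystore, account, signer):
    # Each L2 consults the shared keystore instead of a local per-chain copy,
    # so there is nothing to update on the L2 itself when a key rotates.
    return keystore.keys.get(account) == signer

ks = KeystoreRollup()
ks.set_key("alice", "key_v1")
ks.set_key("alice", "key_v2", authorized_by="key_v1")  # one rotation...
print(l2_accepts(ks, "alice", "key_v2"))  # ...seen by every L2: True
print(l2_accepts(ks, "alice", "key_v1"))  # old key rejected everywhere: False
```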
Yeah, and so I think the question here is basically if you have a system that gives you some ZK level of privacy in theory, then as a user, how do you know that you actually get that level of privacy in practice? And I think I see this as basically a continuation of a problem that Ethereum already has, which is that if you're putting your assets into smart contracts instead of just giving your assets to some guy, then how do you actually know that the smart contract doesn't have a backdoor so the guy can just take your funds whenever he wants? And the solutions that we have so far for that are, one, we have obviously the Etherscan contract reading ability. So you can read a contract. People can publish source code. Etherscan checks the source code. And that's great as a developer tool for sophisticated users. For regular users that are not able to go and read 1,000 lines of code themselves, we've definitely been seeing wallets start to become more sophisticated. We've been seeing wallets start to give more warnings, basically show, are you interacting with a popular application? Are you interacting with some application other people have not interacted with before? One thing I eventually want to see is I want to see dApp user interfaces become versioned. And so even just uploading the interface onto IPFS instead of onto a website, and then if you do that, then every single update would have to be a blockchain transaction. And you can make the authorization for that transaction just be a multisig controlled by the team. And so there would be no server that you can hack into to force an update. And it would also, from a wallet perspective, then the wallet would be able to show you basically how recently has the site been updated, have the contents been approved and things like that. So we want to see more of those. Basically, I think we need to see more ways of aggregating the opinions of high-quality researchers and auditors. And we've been starting to see that. 
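The versioned-frontend idea above reduces to a small mechanism: the UI is content-addressed, and publishing a new version requires a threshold of team signers, so there is no web server to silently hack. A minimal sketch with invented names; a real deployment would put the registry on-chain and the content on IPFS:

```python
# Sketch of a versioned dApp frontend registry gated by a signer threshold.
import hashlib

class FrontendRegistry:
    def __init__(self, signers, threshold):
        self.signers = set(signers)
        self.threshold = threshold
        self.versions = []  # append-only history of UI content hashes

    def publish(self, ui_bytes: bytes, approvals: set):
        """Every update is an explicit, multi-approved transaction."""
        if len(approvals & self.signers) < self.threshold:
            raise PermissionError("not enough signer approvals")
        self.versions.append(hashlib.sha256(ui_bytes).hexdigest())

    def latest(self) -> str:
        return self.versions[-1]

def wallet_trusts(registry, fetched_ui: bytes) -> bool:
    """A wallet checks the UI it fetched against the registered hash."""
    return hashlib.sha256(fetched_ui).hexdigest() == registry.latest()

reg = FrontendRegistry({"alice", "bob", "carol"}, threshold=2)
reg.publish(b"<dapp ui v1>", approvals={"alice", "bob"})
print(wallet_trusts(reg, b"<dapp ui v1>"))   # True
print(wallet_trusts(reg, b"<tampered ui>"))  # False
```

The append-only `versions` list is also what lets a wallet surface "how recently has the site been updated", as the answer suggests.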
And I do think that the wallet is going to play a very important role. I think it's going to be an important active assistant to the user in aggregating all of this kind of information. And I think all of these kinds of tools that we are using and that we are going to use for regular Ethereum are going to need to be applied basically in the same way for ZK technologies. So in the same way as we have Etherscan's publish-your-Solidity-source-code, we need to have publish-your-Circom-code, or even publish your STARK Cairo code, or whatever the original source language is. And then it would compile and it would actually verify that the on-chain verification key matches. These are things that totally can be done. And I hope someone does them in the hackathon tomorrow. But basically take exactly the same types of tools that we've been using for Ethereum contracts and just apply them to both on-chain ZK and to off-chain personal ZK and things like Zupass in basically the same way. Next one, also security-related. How do you propose Ethereum respond to the threat of quantum-enabled attacks? What implications does it have for the broader cryptocurrency ecosystem? I think it's the same question. I mean, it's a similar question. Would there be a standard of criteria or symptoms for foundations that have already restricted or estimated the potential of those integrated functions? So one thing that is, I think, important to realize and that I think a lot of people still don't realize is that from a technology perspective, we have quantum-resistant algorithms for every single thing that is vulnerable to quantum computers. So quantum computers break existing elliptic curve signatures. We have like five different types of hash- and lattice- and isogeny-based signatures that are quantum-resistant. Quantum computers break elliptic curve-based encryption and stealth addresses. We have lattice-based and isogeny-based solutions for that. 
Quantum computers break KZG and IPA-based zero-knowledge proofs. We have STARKs. And recently, there has been actually a new paper that basically showed how you can reduce the number of rounds in the FRI protocol, which makes STARKs about two times smaller than before. So STARKs are hash-based proofs. Hashes are quantum-resistant, so STARKs are quantum-resistant. Fully homomorphic encryption. Actually, we're very lucky that the fully homomorphic encryption constructions that we have and that we had from day one were just quantum-resistant out the door. Quantum computers do not help with lattices. So in theory, it's a solved problem, but there is an important logistical challenge in going from theory to practice. And one nice thing is that I think we do have the ability to do an emergency recovery that preserves most people's funds. And I wrote about this on ETH Research recently. But we want to get from most to all, and we want to get to the point where there's something that can be done where ideally users are just quantum-resistant out the door and even the protocol itself is. And for that to happen, one is we need account abstraction, because account abstraction has lots of benefits, but one of those benefits is that you can provide whatever signature algorithm you want as a way of verifying your account. So if you want to make it quantum-resistant, it can be quantum-resistant. Another is that we need the Ethereum consensus layer to become quantum-resistant. In that case, it's a bit harder engineering-wise because our current approach relies heavily on the fact that we use BLS signatures, which are incredibly efficient. And anything quantum-resistant is going to be less efficient, and that's a fact. And that's actually one of the reasons why I've been pushing for 8192 signatures per slot, for example. I think we need to give ourselves more wiggle room so that we can adjust to a less efficient algorithm if needed. 
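The claim that "hashes are quantum-resistant" is the whole basis of hash-based signatures. A classic illustration is the Lamport one-time signature, whose security rests only on a hash function's preimage resistance; this is an educational toy (one key pair signs exactly one message), not production code:

```python
# Toy Lamport one-time signature over SHA-256. Quantum computers give no
# known structural attack on hash preimages, which is why hash-based
# schemes (and STARKs) are considered quantum-resistant.
import hashlib, os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # Secret key: two random 32-byte values per bit of the message digest.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    # Public key: the hashes of all those secrets.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def _digest_bits(message: bytes, n: int):
    d = H(message)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(n)]

def sign(message: bytes, sk):
    # Reveal exactly one of the two secret preimages per digest bit.
    return [sk[i][b] for i, b in enumerate(_digest_bits(message, len(sk)))]

def verify(message: bytes, sig, pk) -> bool:
    # Check each revealed preimage hashes to the committed public value.
    return all(H(sig[i]) == pk[i][b]
               for i, b in enumerate(_digest_bits(message, len(pk))))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))  # True
print(verify(b"forged message", sig, pk))      # False
```

Practical hash-based schemes (the "five different types" of quantum-resistant signatures mentioned above include this family) build many-time signatures out of structures like this, at the cost of the larger sizes the answer alludes to when comparing against BLS.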
And at the same time, researchers are going full speed ahead on optimizing and measuring and benchmarking post-quantum alternatives as much as possible. Yes, that's it here. All right, that concludes the security part. The next section will be industry trends. So it's a little bit like rentals. First one, AI-related. So in your opinion, what are the key benefits of integrating AI with cryptocurrency and how might it reshape the industry? Yeah, I think a lot of people have been very curious and asking about what is the AI-crypto intersection for basically the last 10 years. And I think it makes sense to ask that because at a very high level and a very thematic level, it feels like there should be one. AI and crypto are two very important technology trends of our time. And there is this line that AI tends toward centralization and crypto tends toward decentralization. And there is supposed to be some kind of yin-yang complement between the two. But the question always has been, can you go from having this thematic convergence to having actual examples of applications that use both in ways that make sense for both in a productive way? And in my recent post from about two months ago, I tried to analyze that question and tried to identify some concrete applications that make sense. So one of them that I talked about is AIs participating in prediction markets or in other kinds of markets on top of Ethereum. And basically you can make markets much more micro-scale and then create AIs that can play in them. Then another is AIs as a part of wallets, to help users make sense of the online and the on-chain environment that they're interacting with. 
A third is using cryptography, including things like zk-SNARKs and ZKML, MPC, and running AI inside of FHE and things like this, to try to create AI models that are secure and robust and privacy-preserving enough that it becomes safe to actually use them as a central participant in on-chain applications, whether DAOs or like oracles or something else. And then the fourth is basically, if that succeeds, then that could be used for AIs in other areas. So out of those, I think the first two are the most obvious short term. And then the last two realistically are more speculative. So I definitely don't want to give the impression that AI-crypto applications are going to be the next great narrative that will carry the industry forward or whatever else. But I do absolutely think that these intersections are worth people looking into. Yeah, and another one is AI's role in debugging code. Basically, one of the biggest challenges of the space is bugs in our code. And one of the nice possibilities that maybe we can look forward to, we don't know yet, is the possibility that AI will make it much, much easier to basically use formal verification tools to prove that much larger sets of code satisfy certain properties. And if we do that, then we could potentially have guaranteed bug-free ZK-EVMs, for example. The more that we can reduce the number of bugs, the more secure the space can become. And there's a chance that AI could be super valuable in that, too. Bug-free sounds really ambitious. And what is your opinion on the restaking fever? This is from AV media. Yeah, restaking, I think, is interesting because on the one hand, it is potentially a very valuable way to unlock ETH that's being used in staking and make it accessible to other applications. And this is one of those things where if there's a demand and if we don't provide for that demand somehow, then there's a risk that basically the demand would get taken over by centralized actors. 
Because anyone can just accept people's ETH and stake on people's behalf and then issue a token and then let people restake that token, right? But that's not a world that we want to see. But then on the other hand, there definitely are risks from doing restaking wrong, like basically risks that various kinds of systemic risks end up also affecting the Ethereum validator set. And so there's some different approaches to that. And I know that different projects have been looking into it. And so far, I'm just watching the space and looking forward to seeing what comes up. Are you more active on Farcaster than on Twitter? Okay. Who here is more active on Farcaster than on Twitter? Who here is going to cast whatever article comes out of this before they tweet it? You should. Okay. Good. Come on. We're the decentralized space. We need to use our own stuff. I think one of the interesting things with the social space is, one is that it has network effects, but it also has anti-network effects. Twitter is the place where all the people are, but it's also the place where all the really annoying people are. I've found that Farcaster has already gotten to the point where there's enough people there that it's interesting. The kind of engagement that I get on there is actually higher quality. I think the other thing that decentralized social can really bring to the table is, one of the big issues in the Twitter management transition and all of the discourse around it in the last few years is basically that Twitter has and had a bunch of features that were basically about trying to tell apart the high-quality content and people from low-quality content and people. A lot of people had critiques that these mechanisms, including the moderation and including the blue checks, were very centralized and privileged a particular group with a particular set of opinions and biases. 
A lot of these non-financial Ethereum applications are fundamentally about solving that problem in a way that is more decentralized, basically trying to solve the trust problem without entrenching some centralized actor that decides who and what is good and who and what is bad. I think things like that could end up combining with Farcaster and Lens and these other protocols, and we could actually get to see a very live example of those kinds of techniques working. Farcaster is definitely starting to have an actual spam problem, so that is something they are going to need to work on, but I'm excited to see that. I'm also excited to see Farcaster alternate clients. The big thing about Farcaster is that it's not a server, it's a chain. In principle, you can make your own clients, and your clients can read or write the same content that people using Warpcast can see, and that's something that I think can absolutely be a wonderful place for people to start trying to create their own clients and add interesting new features. You could imagine people putting their own community notes on Farcaster, or people putting various AI-based fact-checking, or even prediction-market-based fact-checking, or whatever into Farcaster. You could imagine people creating their own mechanisms for identifying which participants are high reputation and which participants are not, for doing content prioritization. Different groups of people can have different approaches, and so all of those things can be done by different clients, and people get to choose for themselves which ones they want to look at the Farcaster content through. I'm very excited to see those kinds of trends play out.
The other thing I'm excited about is that I feel like Farcaster in particular does a good job of being pragmatic and of actually making sure that it's easy enough to use. It's managed to be used by non-crypto people in a way that a lot of other previous applications have not, and I think that's also an important success to build on and for other applications to try to replicate. Okay. Let's start using Farcaster more, then there will be more things. Okay. The next one, from DL News. In the face of greater institutional adoption and greater concentration of power, how can the industry stay true to its original ethos of decentralization and freedom? Yeah. I think there are several important things that the space needs to do at the same time. One is basically to solve the public goods problem. There are a lot of these projects that do a better job of staying true to these values, but often doing that involves a lot of work, and the community needs to come up with and execute on alternative ways of funding and supporting those projects. That's something that we need to get better at, and something for the various grant-giving entities, including Optimism RPGF, and for the community efforts at standardization. Basically, one risk that I see is the risk that there are going to be large actors in the ecosystem that try to pull away from standards and into essentially their own walled gardens and closed ecosystems, and if that happens, the value is going to go down. Also, I think people need to keep actually executing on making real implementations of the theoretical benefits that crypto provides. For example, every one of these major protocols should have an alternate graphical user interface made by an alternate team, for each of these layer 2s and all of the functions in those layer 2s. For Farcaster, Flink exists, but we need more.
There should be alternative GUIs even for Gnosis Safe, for example. Early on, I was willing to switch my own GUI out, because at the beginning the Safe UI was not compatible with the Status wallet, but my UI was. We need to make sure that we actually support open standards for things like accessing logs. A lot of this depends on basically making sure that we have a good relationship between preserving the ethos as much as possible and regulatory issues and mainstream acceptance. Back in September, I co-wrote that paper on privacy pools, which basically provides an approach for how privacy protocols like Tornado Cash can be adapted to still be successful. I think we want to see more of those, and not just for the financial use case but also for identity and reputation types of use cases, coming up with those pragmatic solutions and at the same time helping users to understand what level of privacy they are actually getting when they use them. And I think another one is that we just continue to need to find ways to encourage more and different groups of people around the world to really participate in the Ethereum ecosystem, not just as users, not even just as application developers, but even as layer 2 developers and wallet developers, so that we can preserve and increase Ethereum's decentralization over time. Cool, that's why we have you here, to gather more developers and more. Alright, I think we don't have much time left. Okay, well, I'll do 60 seconds of questions. Go, go, go. Okay. I have a watch.
