This is not true. Kasplex (a third party entity and dev group) developed and built out the current L2 on Kaspa.

There are several other dev groups that are working on L2s and SC. It is an open source protocol.

For you to look at 1 project, the first of its kind, and then determine that “Kaspa” can’t do smart contracts is nonsensical. Development takes time. If you want to study self-executing, native SC, look at Sparkle.


Come on, it's not Turing complete, it lacks smart contract functionality on its base layer, it's a UTXO model with limited scripting, and that's baked into the core of the protocol. There is no math that allows a chain like that to get "upgraded" to Turing complete.

Not that layer 2s are a bad thing, but when we're talking smart contracts on Kaspa we're talking the equivalent of Rootstock and whatnot.

Everything lacks something until it doesn’t. And it doesn’t need to be Turing complete in order to operate trustless smart contracts on the base layer.

Smart contracts as we know them need a Turing complete chain. Yes with a UTXO model like Kaspa you could do P2PK, or multi-sig this-n-that, which, technically yes, those are smart contracts, but that's just wordplay.

For real smart contracts on Kaspa you need the Layer 2, there's no getting around it.

Quai is Turing complete and fully supports smart contracts on the base layer. If the goal is a very fast proof-of-work chain with smart contracts directly on the base layer then Quai offers this now, today. No matter how good Kaspa's layer 2 is, it's all still much messier.

This is patently false. A smart contract is any code enforced on-chain that manages state transitions automatically based on rules.

Turing completeness is not required for that. Your Quai comparison isn’t apples-to-apples. Quai’s determinism is risky at best, and Quai is not wholly POW.

We all know what is meant when we hear someone use the term "smart contract". There is an implied level of complexity. Saying that Kaspa can do smart contracts on the base layer is like winning a court case on some arcane technicality.

Give me an example of the most complex smart contract that Kaspa could ever process on the base layer and I guarantee you it'll be something so simple that most people would be surprised to learn it's technically a smart contract, like some M of N multi-sig or what have you.
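For concreteness, the M-of-N multisig being discussed boils down to a threshold check. Here is a minimal Python sketch of that spend logic, with a hash-based stand-in for real signatures (all names here are hypothetical illustrations, not Kaspa's actual script opcodes):

```python
import hashlib

def sign(secret: bytes, msg: bytes) -> bytes:
    # Stand-in for a real ECDSA/Schnorr signature over the transaction.
    return hashlib.sha256(secret + msg).digest()

def verify(secret: bytes, msg: bytes, sig: bytes) -> bool:
    return sign(secret, msg) == sig

def m_of_n_spend(m: int, holders: list, msg: bytes, sigs: list) -> bool:
    """Script-level logic: the spend is valid iff at least m of the
    n key holders provided a valid signature over the transaction."""
    valid = 0
    for secret in holders:
        if any(verify(secret, msg, s) for s in sigs):
            valid += 1
    return valid >= m

keys = [b"alice", b"bob", b"carol"]
tx = b"spend utxo"
sigs = [sign(b"alice", tx), sign(b"carol", tx)]
print(m_of_n_spend(2, keys, tx, sigs))  # True: 2-of-3 satisfied
```

The point being made in the thread: this whole "contract" is a fixed predicate over signatures, with no loops, no arbitrary state, no general computation.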

You can nitpick on Quai, but there's no arguing that a smart contract on its base layer is a *real* smart contract.

Just like everybody guaranteed 7 years ago POW couldn’t scale on the base layer, huh?

POW scaling on the base layer is not breaking math, it's fine tuning. Performing Turing-complete operations on a non-Turing-complete chain is plain old breaking math.

I mean the whole premise of something like Kaspa is "our layer one is so good we don't need a layer two". But then it turns out you do in fact need a layer two for these key use cases, which kind of undoes the whole marketing.

Anyway Quai is right there, it has smart contracts directly on the L1, it's very fast, it's proof of work, why would anyone choose Kaspa L2 over Quai L1?

You think the whole marketing of a POW chain with a revolutionary directed acyclic graph structure and a block speed of 10bps is base layer smart contracts? Yikes.

To answer your question, Quai isn’t POW consensus. You need to do more homework.

Also, there’s no hard cap on Quai tokens. I don’t like long-standing inflation, no thanks.

Read about Sparkle for Kaspa. You don’t know that an L2 is necessary. They are still developing it.

Are you trying to say there's a way to do Turing operations in a non-Turing-complete environment? Cause that's what it sounds like.

There is not. A quick look at this Sparkle shows it's all rollups and custom zk-opcodes. So Layer 2 stuff. Sparkle has nothing to do with making the base layer magically Turing complete.

Kinda like the RGB folks saying "Now we have smart contracts on Bitcoin". No, no you do not. You have a ZK layer 2, just like all the other ZK layer 2s on all the other chains.

Right but the whole point of Kaspa is that it's a really fast PoW layer 1. That's the entire raison d'etre.

Sure it (and many other layer 1s) can function as a ZK settlement layer, but the point of Kaspa never was for it to be a ZK settlement layer. That's a massive pivot. I mean if all the activity moves to zk sequencers off chain, then what was the point of making it easy for activity to happen on chain?

Either Kaspa sticks to its original mission of being an "all you need" layer 1 and forgets smart contracts altogether, or it becomes yet another fish in the zk settlement layer pond.

Go watch what Sutton said Vprogs are. No liquidity silos. He did an interview with XXIM. Sompolinsky is saying the same thing.

I get the gist, our team does zkapps (o1js, noir, cairo) and there's not a lot new under the zksun.

At the end of the day it's logic executed off-chain by a dedicated prover. It always is. Whether you use a sequencer and call it an L2 or use a so-called vprog and call it an extension to the L1, it's still some outside CPU taking a long time to prove something and then yeeting that proof on back.

Honestly, for ZK, my view is that you need a ZK stack top to bottom. Mina was too early but that is the right path: the entire Mina chain reduces to a 22kb recursive snark, and you can verify anything proven in the entire history of the chain on an iPhone in 100 milliseconds. You just need that 22kb snark and whatever zkapp state proof you got sent to you, and that's it. So a super-fast ZK sequencer rolling up to a ZK-native layer like Mina, or some high speed ZK-native L1 that emerges in a few years, this stuff all makes a lot of sense.

It would be impossible for a Kaspa node to verify Vprog ZK-proofs on an iPhone like a Mina mobile rust node can. For Kaspa the best you can do is an SPV kind of deal, trusting a cluster of full nodes or a centralized RPC endpoint, and anyway on an iPhone an SPV will get stopped once the app goes to the background.

There are projects taking the Mina learnings and coming out in the next few years that will be the future of ZK. It'll be ZKnative top to bottom. (I'm in Asia so I'm biased, but I think ZK for the next 10 years is all about mobile.)

Kaspa is just not ZK-native. You can do a similar trick to these Vprogs (minus the DAG flourish) on Solana, but Solana is not ZK-native either. And Aztec rolling up to Eth, okay Noir is nice, but Eth is not ZK-native either. All of these suffer from the same dissonance and can't be the ZK future.

Solana is for old-fashioned smart contracts. Kaspa, like Bitcoin, is for money. My thoughts anyway.

There wasn’t a whole lot new under the sun for POW, until Kaspa. And here we are at 10bps (everybody said it couldn’t be done). I’m not sure getting “the gist” means you can create something nobody else has been able to. Clearly Sutton and YS disagree with you.

I don't think they would disagree at all. They would agree that Kaspa is not a ZK-native layer 1 like Mina or whatever follows Mina. Because it's not.

But they'd say there's still a use case for yeeting ZK proofs back to Kaspa, which there is.

I'm arguing that use case is quite limited in the grand scheme, and that in the end some future ZK-native layer 1 will win. Vertical integration always wins.

The crux here is that zkapps are not smart contracts in the sense that all computation is executed by every node on the chain. They are an outside thing. When you yeet a ZK proof to chain, the main thing people have to then be able to do is verify the proof. If not, then what was it for?

If verification is hard, or slow, or clumsy, if it requires trust in an RPC call, or running a node, or an SPV, or whatever else, then it'll ultimately fail as a system, because that's all too much of an ask of the verifier.

If they can remove the issue of liquidity silos, what’s your issue with what’s being done?

My issue is on the verification side. These are ZK proofs, create them any way you like -- but they have to be verified by random people. Or what are you doing?

One reason the Kaspa team are going the ZK route is because they cannot do on-chain computation (non-Turing-complete), so they are outsourcing to the end-user's CPU or GPU, or to dedicated proving servers. Which is fine, ZK is useful. But then random people have to verify the computation that those CPUs and GPUs did.

So verification matters, and the whole liquidity silos thing doesn't touch on verification at all, that's a whole other thing.

Let's say that the proof is that a passport showing age over 18 was seen at such and such a time by such and such a website. Creating the proof and validating it on chain is one thing. Trustless verification by random people that need to check this fact is another thing. On Kaspa the best you can do is an old-fashioned RPC call. Not trustless. To truly verify the state themselves, they would need to run a full Kaspa node, which is, like, big. A full Kaspa node needs to maintain the entire state of all vprogs it cares about to run and re-verify the logic locally, or at least be able to access the witness data and state commitments.

Whereas with proofs on a ZK-native layer 1 anyone can do trustless self-verification on a cheap android phone from 10 years ago. This is why the ZK-native layer 1s will win out in the end.

Why do you need to access the witness data if it’s cryptographically verified? That can’t change.

that comes after an "or". it's either or. but a single kaspa node would in any case have to maintain the state commitments of every single vprog across the whole thing going back to the pruning point, whatever that is for kaspa. otherwise it makes no sense. if the pruning point is only a few days back and that's out the window, then you need an archive node, and that's crazy heavy for an auditor that just needs to double check in a trustless way (i.e. no RPC call) that someone who visited a website last week was over 18.

so it fails on that end.

You don’t need archival nodes to verify what’s cryptographically verifiable. Nobody looks at the history and visually verifies UTXOs on a scale of any magnitude.

Ok so let's say you're a compliance auditor. You've been sent by the team a merkle path n' leaf set relating to a website login for a user who supposedly passed an age check, and this includes some user details. You need to run that against Kaspa to see if it checks out, and an RPC call won't do. This suspicious login is from one week ago.

Explain to me how, without an archive node, you do this.

Why is the onus on Kaspa to regulate an age check? This issue is way before Kaspa. Kaspa is only settling what’s been determined (or neglected) prior. That’s like saying the internet is responsible for a 15 year old getting on a porn site.

I don't think you're understanding what ZKproofs are. Kaspa is not regulating anything. It's just doing math.

The ZK proof in such an example would be a mathematical proof that the government-private-key-signed data from the chip inside the passport was scanned by the device (which has its own cryptographic record relating to the nfc scan and loading into memory and so on). All it is is math. The significance of the math is what humans agree on, but once that's agreed on you're gonna need to run occasional checks to make sure the math checks out (and thus that the significance of it checks out).
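To make that concrete, here is the kind of predicate such a passport-age proof attests to, written out in the clear in Python with a hash-based stand-in for the government signature (all names here are hypothetical). In the real system the verifier never sees these inputs — the ZK proof just shows the predicate evaluated to true:

```python
import hashlib
from datetime import date

def govt_sign(secret: bytes, data: bytes) -> bytes:
    # Stand-in for the government's real signature over the chip data.
    return hashlib.sha256(secret + data).digest()

def predicate(chip_data: bytes, dob: str, sig: bytes,
              govt_secret: bytes, today: date) -> bool:
    """The statement a passport-age circuit would encode:
    (1) the chip data is genuinely government-signed, and
    (2) the signed date of birth implies age >= 18 today."""
    if govt_sign(govt_secret, chip_data + dob.encode()) != sig:
        return False
    birth = date.fromisoformat(dob)
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    return age >= 18

secret = b"govt-root-key"            # hypothetical key material
chip = b"passport-mrz-bytes"
sig = govt_sign(secret, chip + b"2000-01-01")
print(predicate(chip, "2000-01-01", sig, secret, date(2025, 1, 1)))  # True
```

The "significance humans agree on" is exactly which predicate the circuit encodes; the proof itself is just evidence that it returned true on some hidden inputs.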

You can extrapolate that to anything a kaspa vprog would do. It's all the same thing, proofs of whatever math underscores whatever thing of interest (often a transaction but not always).

But end of the day, Kaspa is just not designed to do this kind of verification efficiently and trustlessly. This is all an afterthought for Kaspa. So it stands to reason it's not going to be super efficient at it.

Why would the math break? Why the occasional checks?

Because for ZK the data of "what happened" is typically off chain. It's not like an old-fashioned smart contract at all, where all of that data is on chain. So the parties will share that data with each other privately. And then the receiving party can use what is on chain (the merkle root) to determine the "mathematical truthfulness" of what they've been passed. Those are the checks.
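That last check is just a Merkle inclusion proof: the receiving party hashes the privately shared data up the path and compares against the root committed on chain. A minimal sketch, assuming a SHA-256 tree with odd-level duplication (a real chain's tree layout will differ):

```python
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    level = [h(l) for l in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_path(leaves, index):
    """Sibling hashes from leaf to root, each with a left/right flag."""
    level = [h(l) for l in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        path.append((level[sib], sib < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify_inclusion(leaf, path, root):
    node = h(leaf)
    for sibling, is_left in path:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

leaves = [b"login:alice", b"login:bob", b"login:carol"]
root = merkle_root(leaves)                 # this commitment lands on chain
proof = merkle_path(leaves, 1)             # shared privately with the data
print(verify_inclusion(b"login:bob", proof, root))  # True
```

The argument upthread is about where `root` comes from: reading it via an RPC call is trusted, while reconstructing it yourself means holding the relevant chain state.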