Gensyn
Gensyn is the hyperscale, cost-efficient compute protocol for the world's deep learning models. It is the future of deep learning training.
The Gensyn protocol trustlessly trains neural networks at hyperscale and low cost. It achieves lower prices and greater scale by combining two things:
Novel verification system: an efficient solution to the state-dependency problem in neural network training at any scale. The system combines model training checkpoints with probabilistic checks that terminate on-chain. It does all of this trustlessly, and its overhead scales linearly with model size (keeping the relative cost of verification constant).
New supply: leveraging underutilised and unoptimised compute sources. These range from presently idle gaming GPUs to sophisticated Eth1 mining pools about to detach from the Ethereum network. Better still, the protocol’s decentralised nature means it will ultimately be majority community-governed and cannot be ‘turned off’ without community consent; this makes it censorship-resistant, unlike its web2 counterparts.
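The checkpoint-plus-probabilistic-check idea above can be illustrated with a minimal sketch. This is not Gensyn's actual protocol: the `train_step` hash update is a hypothetical stand-in for one deterministic training step, and the names `run_training` and `spot_check` are invented for illustration. The point it shows is that the verifier only re-executes one randomly sampled interval, so verification cost stays fixed no matter how long the full training run is.

```python
import hashlib
import random

def train_step(state: bytes, step: int) -> bytes:
    """Hypothetical stand-in for one deterministic training step."""
    return hashlib.sha256(state + step.to_bytes(8, "big")).digest()

def run_training(initial: bytes, steps: int, interval: int) -> list:
    """Prover: run the full training, publishing a checkpoint every
    `interval` steps (the initial state counts as checkpoint 0)."""
    state = initial
    checkpoints = [state]
    for step in range(steps):
        state = train_step(state, step)
        if (step + 1) % interval == 0:
            checkpoints.append(state)
    return checkpoints

def spot_check(checkpoints: list, interval: int) -> bool:
    """Verifier: pick one interval at random, re-execute it from the
    preceding checkpoint, and compare against the prover's next one.
    Cost is one interval of work regardless of total training length."""
    i = random.randrange(len(checkpoints) - 1)
    state = checkpoints[i]
    for step in range(i * interval, (i + 1) * interval):
        state = train_step(state, step)
    return state == checkpoints[i + 1]
```

An honest prover's checkpoints always pass the spot check, while a cheating prover who skipped work is caught with probability proportional to the fraction of intervals they faked, which is what makes probabilistic checking economical at scale.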
Vastly increasing the scale of accessible compute, whilst simultaneously reducing its unit cost, opens the door to a completely new paradigm for deep learning for both research and industrial communities.
Improvements in scale and cost allow the protocol to build up a set of already-proven, pre-trained base models (also known as Foundation Models), in a similar way to the model zoos of popular frameworks. This allows researchers and engineers to openly research and train superior models over huge open datasets, in a similar fashion to the Eleuther project. These models will solve some of humanity’s fundamental problems without centralised ownership or censorship.
Cryptography, particularly Functional Encryption, will allow the protocol to be leveraged over private data on demand. Huge foundation models can then be fine-tuned by anyone using a proprietary dataset, preserving the value and privacy of that data while still sharing collective knowledge in model design and research.