"DxChain" wants to use "three chains in a single" to break through the storage and processing bottlenecks from the blockchain and support decentralized big data operations

I have covered quite a few public chains and discussed many "interesting" consensus mechanisms and network designs, but this is the first time I have heard of a "three chains in one" system architecture.


DxChain CTO Wang Wei believes it is difficult for a single main chain to meet data storage, computing, and privacy needs at the same time. So, borrowing the Lightning Network's idea that "when the main chain is not enough, side chains come to help," DxChain adds a storage chain and a computing chain as side chains, while the main chain is only responsible for recording events (such as transactions), thereby improving overall network performance to support big data storage and high-speed computation.
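
To make this division of labor concrete, here is a rough Go sketch of the three roles expressed as interfaces. This is my own illustration, not DxChain's code, and all type and method names (MainChain, StorageChain, ComputeChain, and so on) are assumptions.

```go
package dxsketch

// Minimal sketch of the "three chains in one" division of labor.
// These interfaces are illustrative assumptions, not DxChain's actual API.

// MainChain only records events (such as transactions) and hands heavy work off.
type MainChain interface {
	RecordEvent(event []byte) (eventID string, err error)
	SubmitTask(taskID string) error // delegate a big-data job to the side chains
}

// StorageChain tracks metadata for data kept in an off-chain distributed file system.
type StorageChain interface {
	PutMetadata(fileID string, catalog []byte) error
	GetMetadata(fileID string) ([]byte, error)
}

// ComputeChain records how computation jobs are matched to miners and verified.
type ComputeChain interface {
	AssignTask(taskID, miner string) error
	RecordResult(taskID string, resultHash []byte) error
}
```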

Let me first introduce the functions of the two side chains:


* The storage chain is responsible for keeping metadata. Metadata works like an electronic catalog: it records how file fragments can be extracted and reassembled, so that files in the off-chain distributed file system can be retrieved through this relay data.


* The computing chain is responsible for recording the matching process of computations, such as which data is assigned to which miner and whether the work is completed (essentially the metadata of the computation process). In this way, results can be verified across the network without every node redoing the computation; in practice, only the super nodes verify. A minimal sketch of both record types follows this list.
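
Here is what those two kinds of side-chain records might look like in Go; the struct and field names are my own assumptions, since the article does not give whitepaper-level details.

```go
package dxsketch

// MetadataRecord is the storage chain's "electronic catalog" entry: it does not
// hold file contents, only how to extract and reassemble the off-chain fragments.
type MetadataRecord struct {
	FileID    string
	Fragments []Fragment // ordered list needed to rebuild the file
}

// Fragment points at one piece of the file in the distributed file system.
type Fragment struct {
	Index int    // position of this fragment in the original file
	Node  string // hypothetical address of the storage miner holding it
	Hash  string // checksum used to verify the fragment on retrieval
}

// ComputeRecord is the computing chain's "metadata of the computation process":
// which data went to which miner, and whether the work was completed.
type ComputeRecord struct {
	TaskID     string
	Miner      string
	InputFiles []string // FileIDs pulled from the storage chain
	ResultHash string   // hash of the result, checked by super nodes only
	Completed  bool
}
```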

Now let's look at the consensus mechanisms. The DxChain main chain adopts PoW, because the main chain has the highest requirements for security and stability, and PoW has been tested for years in the Bitcoin blockchain and Ethereum 1.0. Both side chains use DPoS to decide who produces blocks, but each chooses a different scheme for deciding who verifies the work.
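
As a quick summary of that split, here is an illustrative configuration table in Go; the names and structure are mine, purely for readability.

```go
package dxsketch

// ConsensusConfig summarizes which mechanism each chain uses, as described in
// the article; the structure itself is only illustrative.
type ConsensusConfig struct {
	BlockProduction string // how the next block producer is chosen
	Verification    string // how the work recorded in blocks is verified
}

var consensusByChain = map[string]ConsensusConfig{
	"main":    {BlockProduction: "PoW", Verification: "PoW"},
	"storage": {BlockProduction: "DPoS", Verification: "PoS + PDP"},
	"compute": {BlockProduction: "DPoS", Verification: "PDC + verification game"},
}
```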


* The storage chain uses a hybrid PoS + PDP (Provable Data Possession) mechanism for verification, to defend against three kinds of attacks: the Sybil attack, where a malicious node forges multiple false identities so that one party controls what appear to be many nodes; the outsourcing attack, where an attacker who is asked to prove it is storing certain data fetches the proof (or the data itself) from other miners and pretends it has stored the data all along; and the generation attack, where the attacker produces the data in a reproducible way (for example, as compressed files) and simply regenerates it (decompresses the files) when challenged, claiming to have done the storage work. A toy challenge-response sketch appears after this list.


* The computing chain's verification process adopts DxChain's own PDC (Provable Data Computation) plus a "verification game". In a decentralized environment, verifying the authenticity of a result usually relies on repeated computation to reduce the chance that a false result slips through. PDC is responsible for verifying computations and can obtain a correct answer from a group of untrusted nodes with only a small probability of being attacked, while the "verification game" checks that the computation process itself can be verified (Truebit also uses a "verification game" to audit computations). A bisection sketch of the verification game also follows this list.
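
To give a feel for how a storage proof counters the outsourcing and generation attacks, here is a toy challenge-response sketch in Go. It is a simplified stand-in for real PDP, not DxChain's actual scheme: the verifier computes salted digests of random fragments while it still holds the data, and a later challenge with the same salt can only be answered by someone who actually holds the fragment at that moment.

```go
package dxsketch

import (
	"bytes"
	"crypto/rand"
	"crypto/sha256"
)

// Toy spot-check in the spirit of PDP (Provable Data Possession). It only
// shows why a salted challenge defeats precomputed or regenerated answers.

type storageChallenge struct {
	fragmentIndex int
	salt          []byte // unpredictable, so the proof cannot be prepared in advance
	expected      []byte // digest the verifier computed while it still had the data
}

// prepareChallenge is run by the verifier at upload time, before handing the
// fragment to a storage miner.
func prepareChallenge(index int, fragment []byte) (storageChallenge, error) {
	salt := make([]byte, 16)
	if _, err := rand.Read(salt); err != nil {
		return storageChallenge{}, err
	}
	return storageChallenge{
		fragmentIndex: index,
		salt:          salt,
		expected:      saltedDigest(salt, fragment),
	}, nil
}

// proveStorage is run by the miner when challenged; answering requires the real fragment.
func proveStorage(salt, fragment []byte) []byte {
	return saltedDigest(salt, fragment)
}

// verifyStorage checks the miner's answer against the stored digest.
func verifyStorage(c storageChallenge, proof []byte) bool {
	return bytes.Equal(c.expected, proof)
}

func saltedDigest(salt, fragment []byte) []byte {
	h := sha256.New()
	h.Write(salt)
	h.Write(fragment)
	return h.Sum(nil)
}
```

Because each stored digest can only be used once, real PDP schemes replace this trick with homomorphic tags that support an unlimited number of challenges.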
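
The "verification game" half can be illustrated with a bisection over intermediate states, in the style Truebit popularized. Again this is only a sketch under my own simplifying assumptions: the computation is modeled as a deterministic step function over integer states, and the referee has to re-execute just one step.

```go
package dxsketch

// Sketch of a verification game over a step-by-step computation. The solver
// publishes intermediate states; a challenger who ran the computation itself
// bisects to a step where the two sides diverge; the referee then re-executes
// only that single step to settle the dispute.

// step is the deterministic transition function of the disputed computation.
type step func(state int) int

// firstDisputedStep returns an index i such that solver[i-1] == challenger[i-1]
// but solver[i] != challenger[i]. It assumes both traces start from the same
// agreed input (index 0) and disagree on the final state.
func firstDisputedStep(solver, challenger []int) int {
	lo, hi := 1, len(solver)-1 // invariant: agreed at lo-1, disputed at hi
	for lo < hi {
		mid := (lo + hi) / 2
		if solver[mid] == challenger[mid] {
			lo = mid + 1 // the divergence lies after mid
		} else {
			hi = mid // the divergence is at mid or earlier
		}
	}
	return lo
}

// refereeCheck re-executes the single disputed step from the last agreed state
// and reports whether the solver's claimed state for that step is correct.
func refereeCheck(f step, solver []int, disputed int) bool {
	return f(solver[disputed-1]) == solver[disputed]
}
```

The point is that honest parties pay for only a logarithmic number of comparisons plus a single on-chain re-execution, instead of re-running the whole job on every node.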

Let's take a scenario as an example. A research institute wants to run a "fitness survey"; "find samples that are American, male, under 35 years old, and employed" is a computation event. Under the "three chains in one" workflow, the main chain hands the computation task to the computing chain, which at the same time retrieves data from the storage chain; the two side chains interact cross-chain to produce a new data set and store it back on the storage chain, then "notify" the main chain that the work is complete, and the miners who contributed computation and storage are rewarded. A sketch of this flow follows.
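
Put into code, that flow might look something like the sketch below. The chain interactions are reduced to plain function parameters so the example stays self-contained; every name, from RunSurvey to the field layout of Sample, is an assumption for illustration.

```go
package dxsketch

import "fmt"

// Sample is a hypothetical record in the survey data set.
type Sample struct {
	Nationality string
	Gender      string
	Age         int
	Employed    bool
}

// matchesSurvey is the computation event: American, male, under 35, employed.
func matchesSurvey(s Sample) bool {
	return s.Nationality == "US" && s.Gender == "male" && s.Age < 35 && s.Employed
}

// RunSurvey walks through the cross-chain flow: the main chain dispatches the
// task, the computing chain pulls data from the storage chain, the filtered
// result set is stored back, and the main chain is notified so miners get paid.
func RunSurvey(
	fetchFromStorage func(dataSet string) []Sample, // storage-chain retrieval
	storeResult func(result []Sample) (fileID string), // write back to the storage chain
	notifyMainChain func(taskID, fileID string), // completion event on the main chain
	rewardMiners func(taskID string), // pay the storage and compute miners
) {
	const taskID = "fitness-survey-001"

	samples := fetchFromStorage("population-data") // cross-chain read
	var matched []Sample
	for _, s := range samples { // the computing chain's actual work
		if matchesSurvey(s) {
			matched = append(matched, s)
		}
	}

	fileID := storeResult(matched)  // new data set back onto the storage chain
	notifyMainChain(taskID, fileID) // "notify" the main chain the work is done
	rewardMiners(taskID)            // compensate the miners involved
	fmt.Printf("task %s produced %d matching samples\n", taskID, len(matched))
}
```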

In terms of cross-chain interaction, DxChain chose relay technology. The early BTC Relay can be understood as a smart contract on Ethereum that connects the Ethereum network and the Bitcoin network in a decentralized way; in DxChain, the relationship between the main chain, the computing chain, and the storage chain is similar.
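
To show what a relay does here, a stripped-down sketch inspired by the BTC Relay idea (not taken from DxChain's code): one chain stores another chain's block headers, reduced to their Merkle roots, and checks that an event is included in a relayed block via a Merkle proof.

```go
package dxsketch

import (
	"bytes"
	"crypto/sha256"
)

// Relay keeps headers (here just Merkle roots) of another chain's blocks and
// verifies event inclusion against them. Hash layout is simplified for illustration.
type Relay struct {
	roots map[string][]byte // blockID -> Merkle root of that relayed block
}

func NewRelay() *Relay { return &Relay{roots: make(map[string][]byte)} }

// SubmitHeader records a relayed block header, reduced to its Merkle root.
func (r *Relay) SubmitHeader(blockID string, merkleRoot []byte) {
	r.roots[blockID] = merkleRoot
}

// VerifyInclusion folds the event's hash with each sibling hash in the proof
// and compares the result with the stored root. rights[i] says whether the
// sibling sits on the right-hand side at level i.
func (r *Relay) VerifyInclusion(blockID string, event []byte, siblings [][]byte, rights []bool) bool {
	root, ok := r.roots[blockID]
	if !ok || len(siblings) != len(rights) {
		return false
	}
	h := sha256.Sum256(event)
	cur := h[:]
	for i, sib := range siblings {
		var pair []byte
		if rights[i] {
			pair = append(append(pair, cur...), sib...)
		} else {
			pair = append(append(pair, sib...), cur...)
		}
		sum := sha256.Sum256(pair)
		cur = sum[:]
	}
	return bytes.Equal(cur, root)
}
```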

DxChain's biggest technical hurdle is that the design of its two side chains draws on Hadoop. Hadoop is a distributed system infrastructure that lets users make full use of clusters for high-speed computation and storage and develop distributed programs. Within it, the distributed file system HDFS provides storage for big data, and MapReduce provides computation for big data.
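
For readers who have not used Hadoop, the MapReduce half can be illustrated with a tiny in-memory analogue in Go; this is only a sketch, since real MapReduce shards both phases across a cluster over data stored in HDFS.

```go
package dxsketch

// Tiny in-memory analogue of the MapReduce pattern: the map phase emits
// key/value pairs from each input record, and the reduce phase aggregates all
// values that share a key.
func mapReduce[K comparable, V, R any](
	records []string,
	mapper func(record string) (K, V),
	reducer func(key K, values []V) R,
) map[K]R {
	grouped := make(map[K][]V)
	for _, rec := range records {
		k, v := mapper(rec)
		grouped[k] = append(grouped[k], v)
	}
	out := make(map[K]R, len(grouped))
	for k, vs := range grouped {
		out[k] = reducer(k, vs)
	}
	return out
}

// Example use: count records per country in a survey data set, assuming each
// record is simply a country code.
func countByCountry(records []string) map[string]int {
	return mapReduce(records,
		func(rec string) (string, int) { return rec, 1 },
		func(_ string, ones []int) int { return len(ones) },
	)
}
```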

Hadoop once set the industry standard for the big data services built by centralized giants (such as Google), but it has not yet been applied in the decentralized world to solve cross-company, trustless big data processing and storage. One issue is that even small files take up a 64 MB or 128 MB block, which is neither economical nor efficient; the other is a talent bottleneck, since Hadoop's PMC technical committee has relatively few core members. The DxChain team has Hadoop research experience and hopes to become an infrastructure for decentralized big data storage and computing.

DxChain chose to build a public chain for the big data market because its founder, Zhang Liang, is also the founder of Trustlook, a Silicon Valley security company. In running his own company, Zhang Liang found that the cost of buying packaged big data was too high. Wang Wei believes that using the decentralized nature of the blockchain and bringing in more nodes makes it possible to slice big data at a finer granularity, more flexibly and accurately, and to save procurement costs; it can also prevent files from being lost or tampered with in centralized storage. Hence Trustlook launched DxChain.

Wang Wei said that DxChain expects to release a test chain within three months and plans to launch the main network early next year. Trustlook will also be the first application on DxChain.

The team has a total of 10 people, including 6 core engineers, 2 community operators, and 2 public relations leads; some of the engineers are responsible for DAPP development, so that the team understands the habits and needs of developers.

In terms of financing, DxChain has completed its cornerstone round and is now at the stage of institutional fundraising.

I'm Hao Fangzhou. If you would like a high-quality blockchain project to be covered, you can add me on WeChat at nooxika. Please note your company + name + reason.

