

Thorhead Tech Part 2: The Scale of VeThor

In Part 1 of this series, I went over the technical details of how the VeChain smart contract powering Thorhead works. In this entry in the series, I'll go over the math behind the procedurally generated images and what that means for the global VeThor supply.

86,039,953,642 VeThor

As human beings, we are notoriously bad at understanding the true scale of large numbers. From the perspective of human history, it makes sense. As hunter-gatherers, we didn't need to understand the concept of a billion. On a survival level, it is better to accurately understand the difference between one attacking lion and two attacking lions. That's a 100% increase in the number of lions that want to eat you. That's a big deal. But if it's 85 billion lions or 86 billion lions, that's effectively the same amount of danger. You're still going to get eaten.

So when we talk about 86 billion VeThor, it is borderline impossible to understand what that really means. It's a number so large that it loses meaning. So what can we do to put this into a context we can understand? What does this number represent? And what does 86 billion really mean in terms of the VeChain ecosystem?

I'm going to try to answer these questions by comparing some large numbers and seeing if we can get a sense of the scale¹ of 86 billion VeThor: both how large and, more importantly, how small it actually is. We'll start with the Thorhead project itself and then expand to more real-world, global examples.

Weighted Distribution

Attributes in Thorhead are determined using a weighted distribution. This means that for every attribute, there is a numeric weight assigned to it. Think of that numeric weight as a number of colored marbles added to a bag. The more marbles of a given color, the more likely you are to draw that color. In the case of Thorhead, the colors are the attributes and the marbles are the weights.

Let's take a look at a real example²:

@background_weight_map %{
  gray: 2050,
  green: 1025,
  blue: 530,
  purple: 250,
  pink: 125,
  red: 62,
  orange: 31,
  gold: 15,
  vechain_gradient: 1
}

In this example, the gray attribute has a weight of 2050. This means that there are effectively 2050 gray marbles in the bag. The green attribute has 1025 marbles in the bag, and so on. To get the percentage chance of drawing a given color, you divide the weight of the color by the total number of marbles in the bag. In this case there are a total of 4089 "marbles" in the bag, so the percentage chance of selecting gray as the background color is 2050 out of 4089, or stated as a percentage: 50.13%. (We'll talk about this percentage a bit later.)
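
As a quick sanity check, here's a small snippet (mine, not from the Thorhead codebase) that derives those percentages directly from the weight map:

# Not from the Thorhead codebase: derive each color's percentage
# chance from its share of the total weight
total = @background_weight_map |> Map.values() |> Enum.sum()
#=> 4089

@background_weight_map
|> Enum.map(fn {color, weight} -> {color, Float.round(weight / total * 100, 2)} end)
|> Enum.sort_by(fn {_, pct} -> pct end, :desc)
#=> [gray: 50.13, green: 25.06, blue: 12.96, purple: 6.11, ...]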

To do this in code, the Erlang :rand module is used to pick a "marble" from the bag, and then we walk the colors from rarest to most common, accumulating weights until the running total covers the selection. The code looks like this:

# A random number between 1 and 4089 (inclusive)
selection = :rand.uniform(4089)

# Find the color that corresponds to the selection by accumulating
# weights from rarest to most common until the running total
# reaches the selection
@background_weight_map
|> Enum.sort_by(fn {_, weight} -> weight end, :asc)
|> Enum.reduce_while(0, fn {color, weight}, running_total ->
  if selection <= running_total + weight do
    {:halt, color}
  else
    {:cont, running_total + weight}
  end
end)

And there we go. We've picked a color based on a weighted distribution! Now to take this a step further, we can do bucketed weighted distributions. This is where we take a group of attributes and assign a weight to the group. This is how the skin color attribute is generated in Thorhead. The weight map looks like this:

@skin_color_weights %{
  # {weight, [options], rarity}
  base: {100, [:light, :olive, :dark], 0},
  group1: {20, [:blue, :pink, :orange], 1},
  group2: {5, [:green, :purple, :red], 4},
  group3: {1, [:gold], 12}
}

In this example, the total number of "marbles" in the bag is 126. Similar to the background color, we use the :rand module to pick a number between 1 and 126, then accumulate weights to find the group the selection falls into. The main difference is in the last step, where we select a random entry from the group's options. The code looks like this:

# A random number between 1 and 126 (inclusive)
selection = :rand.uniform(126)

# Find the group that corresponds to the selection by accumulating
# weights from rarest to most common, then pick a random color from
# that group's options
@skin_color_weights
|> Enum.sort_by(fn {_, {weight, _, _}} -> weight end, :asc)
|> Enum.reduce_while(0, fn {_, {weight, options, _}}, running_total ->
  if selection <= running_total + weight do
    {:halt, Enum.random(options)}
  else
    {:cont, running_total + weight}
  end
end)

And there we go! We've picked a skin color based on a weighted distribution. The same process is used for all attributes whenever a Thorhead is minted.
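
If you want to play with the pattern yourself, here's a hypothetical generalization (my sketch, not a function from the Thorhead codebase) that performs a weighted draw over any map of value => weight:

# Hypothetical helper, not from the Thorhead codebase
defmodule WeightedDraw do
  def pick(weight_map) do
    total = weight_map |> Map.values() |> Enum.sum()
    selection = :rand.uniform(total)

    weight_map
    |> Enum.sort_by(fn {_, weight} -> weight end, :asc)
    |> Enum.reduce_while(0, fn {value, weight}, running_total ->
      if selection <= running_total + weight,
        do: {:halt, value},
        else: {:cont, running_total + weight}
    end)
  end
end

WeightedDraw.pick(%{gray: 2050, green: 1025, blue: 530})
#=> :gray (about 57% of the time)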

The astute reader will notice that the rarity field is not used in the code. This is because the rarity is used in the calculation of the total rarity of a Thorhead. Let's talk about that next.

Rarities in Thorhead

Each attribute of a Thorhead has a rarity associated with it, an approximation of the odds of getting that attribute. The total rarity of a Thorhead is the sum of the rarities of all its attributes, which approximates the odds of getting that combination. But you may be asking yourself, "Where do these numbers come from?" The short answer: I estimated with some math. The longer answer: I derived them from the "inverse" of the weight of the given attribute.

Track with me here, because this is where the mathematical rabbit hole gets deep.

Take the background colors example from above: the gray attribute is effectively the default background color. It has the highest weight and therefore the highest likelihood of being picked, so it makes sense that the gray attribute would have the lowest rarity. Now, because the weights decrease roughly exponentially (each weight is about half the previous weight), the rarities should also increase roughly exponentially. So if we start with the "default" attribute at rarity 0, we can use a slightly modified Fibonacci sequence to increase the rarity of each attribute. In the case of the background colors, the rarities are as follows:

@background_rarity [
  gray: 0,
  green: 1,
  blue: 2,
  purple: 3,
  pink: 5,
  red: 8,
  orange: 13,
  gold: 21,
  vechain_gradient: 34
]
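
If you're curious, that "slightly modified" sequence is just the Fibonacci numbers starting from 1 and 2, with a 0 prepended for the default attribute. A quick way to generate it (my sketch, not Thorhead code):

# Not from the Thorhead codebase: generate the modified Fibonacci
# rarity values (0 for the default attribute, then 1, 2, 3, 5, 8, ...)
fib = Stream.unfold({1, 2}, fn {a, b} -> {a, {b, a + b}} end)
[0 | Enum.take(fib, 8)]
#=> [0, 1, 2, 3, 5, 8, 13, 21, 34]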

Combining this with our usage of the weighted distribution above, we can take our percentage chance of selecting a given attribute and associate that with the rarity of the attribute.

Background Color    Weight   Rarity   Percentage Chance
Gray                2050     0        50.13%
Green               1025     1        25.06%
Blue                530      2        12.96%
Purple              250      3        6.11%
Pink                125      5        3.06%
Red                 62       8        1.52%
Orange              31       13       0.76%
Gold                15       21       0.37%
VeChain Gradient    1        34       0.02%

This chart may look familiar if you've visited the rarities page on the Thorhead dapp. But I bring up these percentage and rarity calculations because they will be essential to understanding the next section.

The Limits of Thorhead

This brings me to one of my key questions when it comes to the math behind Thorhead:

How much VeThor would it take to mint every possible permutation of Thorhead attributes?

Now the answer to this question is not as simple as it may seem. With large-scale statistics like this, we cannot look for 100% certainty because each Thorhead mint is independent of every other mint; you can only be sure of the likelihood of a given outcome. So when I say "every possible permutation", I mean every possible permutation with a 99% certainty. So how do we calculate this?
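
For a single outcome, that "99% certainty" requirement has a clean closed form: if an outcome has a per-mint probability p, then after n independent mints the chance of having seen it at least once is 1 - (1 - p)^n, so we need n >= ln(0.01) / ln(1 - p), which is roughly 4.6 / p for tiny p. Here's that formula as a quick sketch (my math, not a function from the Thorhead codebase):

# Not from the Thorhead codebase: mints needed to see an outcome
# with per-mint probability p at least once at the given certainty
mints_needed = fn p, certainty ->
  ceil(:math.log(1 - certainty) / :math.log(1 - p))
end

# For example, the VeChain Gradient background alone (1 in 4089)
mints_needed.(1 / 4089, 0.99)
#=> 18829

Doing that for every permutation simultaneously is much hairier, since the permutations interact.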

We need to first determine the highest possible rarity of a Thorhead. Referencing the table of rarities, that would be:

Attribute           Value                 Rarity
Background Color    VeChain Gradient      34
Head Style          Top Hat/Cowboy Hat    21
Head Height         Tier 9                34
Skin Color          Gold                  12
Primary Color       Rainbow               10
Accent Color        Rainbow               10
Holographic         Yes                   100
Total Rarity                              221

With a maximum rarity of 221, we need to calculate every other combination and the collective odds of those combinations occurring. Basically, we multiply each attribute's probability of being selected by every other attribute's probability, across every possible combination of attribute values, and sum the probability of every combination that reaches a given total rarity. I'll spare you the complete detailed math (you can see the code here if you'd like to play along at home).
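
Conceptually, this is a discrete convolution of the per-attribute rarity distributions. Here's a toy sketch of the idea with just two attributes and made-up numbers (my code, not the Thorhead Analysis module):

defmodule RarityToy do
  # Combine two independent rarity distributions (maps of
  # rarity => probability) by summing rarities and multiplying
  # probabilities: a discrete convolution
  def convolve(dist_a, dist_b) do
    for {ra, pa} <- dist_a, {rb, pb} <- dist_b, reduce: %{} do
      acc -> Map.update(acc, ra + rb, pa * pb, &(&1 + pa * pb))
    end
  end

  # Probability that the combined rarity is at least `threshold`
  def tail_probability(dist, threshold) do
    dist
    |> Enum.filter(fn {rarity, _} -> rarity >= threshold end)
    |> Enum.map(fn {_, p} -> p end)
    |> Enum.sum()
  end
end

# Made-up example: a two-attribute Thorhead
background = %{0 => 0.75, 2 => 0.2, 34 => 0.05}
skin = %{0 => 0.9, 12 => 0.1}

background
|> RarityToy.convolve(skin)
|> RarityToy.tail_probability(34)
#=> 0.05 (the rarity 34 and rarity 46 outcomes combined)

Running the real version of this across all of Thorhead's attributes, the Analysis module lets us calculate the total number of VeThor needed to mint every possible Thorhead with a 99% certainty: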

Analysis.run_example(221)

#=> Probability that total rarity >= 221 is: 2.625543403204913e-21
** (ArithmeticError) bad argument in arithmetic expression
    (thorhead 0.1.0) lib/thorhead/analysis.ex:196: Thorhead.Analysis.run_example/1

Uh oh. Turns out the number is so large that it exceeds the limits of the Erlang VM. That's a massive number (at least 2^60). But let's take a look at what we do see: Probability that total rarity >= 221 is: 2.625543403204913e-21. This is scientific notation for approximately a 0.00000000000000000026% chance of getting a Thorhead with a total rarity of 221. That's roughly a 1 in 400 quintillion chance, aka 400 billion billion. If we scale down the rarity to be within the limits of what the Erlang VM can handle, we get a number that is still so large that it is effectively meaningless. But let's try to put it into context:

Analysis.run_example(203)

#=> Probability that total rarity >= 203 is: 7.463676976277862e-17
#=> Number of mints needed: 41479685467187368

41 quadrillion mints needed at 100 VeThor per mint. That's 4.1 quintillion VeThor to effectively guarantee that every permutation has been minted. Compare that to the current circulating supply of VeThor: 86 billion. That number pales in comparison to the number of VeThor needed to mint every possible Thorhead. Look at it in terms of just the number of digits:

8 6 0 3 9 9 5 3 6 4 2
4 1 4 7 9 6 8 5 4 6 7 1 8 7 3 6 8 0 0

That's a whole eight orders of magnitude of difference. At the current VeThor generation rate, it would take roughly 300 million years³ to generate that much VeThor. But that's just Thorhead. Let's take a look at a more real-world example.

Global Scale

There are approximately 4.3 billion bottles of wine sold each year in the United States alone⁴. Conservatively, let's say that VeChain is able to help author digital passports for the top 1% of those bottles. That's 43 million bottles of wine that need to have data written to VeChain. If we want VeChain to be the source of truth for that data, I estimate that it would take 5~10 VeThor per transaction. That's ~220 million VeThor per year in just the US for just the wine market. That comes out to ~2% of the current annual generation rate of VeThor. If you expand that to any number of other industries or increase the market penetration of digital passports into the wine industry, you can see how quickly the demand for VeThor can outstrip the supply.
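
For those playing along at home, the arithmetic behind that paragraph looks like this (all inputs are the estimates above, not measured data):

# Back-of-the-envelope math using the estimates above
bottles_per_year = 4_300_000_000
passported_bottles = div(bottles_per_year, 100)  # top 1% => 43,000,000
vtho_per_transaction = 5                         # low end of the 5~10 estimate

vtho_per_year = passported_bottles * vtho_per_transaction
#=> 215_000_000 (~220 million VeThor per year)

annual_generation = 37_145_537 * 365
#=> 13_558_121_005 (~13.6 billion VeThor per year)

vtho_per_year / annual_generation * 100
#=> ~1.6 (call it ~2% of annual generation)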

Let's translate this into USD though.

At the current price of VeThor, 200 million VeThor would be about $3 million USD. While that may not seem like much in terms of the scale of business, that would be for one industry in one country. If you expand that to other industries at a global scale, you can see how the cost of writing data to VeChain can quickly add up. Other industries, like luxury goods, pharmaceuticals, and automotive, would easily dwarf the wine industry in terms of data written to VeChain. For example, if a single manufacturer in any one of those industries wants to author digital passports for their products, you're looking at around 3 billion data points that need to be written to VeChain.

Conservatively, that's 6~10 billion VeThor that would need to be purchased and burned to write that data. That would be ~50% of a year's worth of VeThor generation (and ~7% of the circulating supply) for a single manufacturer in a single industry. If they were to buy that much VeThor on the open market, it would cost about $90 million USD per year⁵ and send massive shockwaves through the market. That's a significant upfront cost with a high amount of long-term risk for a single manufacturer compared to hosting the data themselves.

Conclusion

With all of this in mind, I believe the current generation rate of VeThor is too low to make purchasing VeThor on the open market viable. The VeChain Foundation should increase the generation rate of VeThor to lower the upfront cost of enterprise adoption. My expectation for the upcoming VeChain Renaissance is a significant increase in the VeThor granted to those with VET staked, as well as a lower VET staking floor. This would let companies minimize financial risk by purchasing and staking VET to fuel their VeThor needs, avoiding the open market entirely. That would make it significantly more attractive for companies to use VeChain as the source of truth for their data.

While Thorhead is a fun project that strives to educate and broaden the VeChain community, it also serves as an illustration of just how small the VeThor supply is in the grand scheme of the global economy. I believe critiquing the tokenomics from a crypto-centric perspective is missing the point. In order to continue earning enterprise trust and adoption, we must set aside the relatively small-scale bumps in "market value", disregard the meme-coin-driven mania, avoid the hourly pump-and-dumps, and focus on the long-term path toward global adoption. I have already proposed a few next steps that could expand the technical foundation for future adoption. It is my hope that this series has helped illustrate the scale of VeThor and the potential for VeChain to be a major player in the global push for authenticity in an increasingly digitally inauthentic world.


  1. There is an amazing site by Matt Korostoff that shows the scale of wealth inequality in the US. It's called Wealth, shown to scale. It's a stunning way to visualize the scale of large numbers.

  2. All code examples are copy/pasted from the Thorhead codebase. Any modifications are made for clarity and brevity.

  3. (4147968546718736800 - 86039953642) / 37145537 VTHO per day ≈ 111,667,786,000 days ≈ 306 million years

  4. https://alcohol.org/guides/beer-wine-production-consumption-worldwide/

  5. 6 billion VeThor * $0.015 = $90 million USD based on the average price of VeThor listed on Coinbase.


by Kyle