Capitalize on the fall in cryptocurrency mining

It may be tempting to scoff at the rapid fall of cryptocurrencies, but that would be misguided: serious opportunities are emerging as a result. For those unaware, crypto miners have spent the past few years buying up just about every high-capacity GPU on the market, driving up prices and reducing availability to the point where even major cloud providers couldn't get their hands on current models.

Combined with Moore's Law, this has left the average GPU used for anything other than crypto several years old and roughly a quarter as powerful as normal market conditions would support. The shortage has also led many software companies to skip optimizing their products for the GPU altogether. The net effect: on average, the software you're using is probably ten times slower than it should be.

This is probably the biggest market opportunity in a generation, and smart companies should now be looking to exploit it. Speeding up your word processor or spreadsheet ten times is unlikely to unlock major business value. But there are several important areas where such a speedup will.

Data analytics and database systems

The most obvious area is database systems, especially those that handle big data. The digitization of the world has not slowed down, and as a result, systems built on legacy databases are struggling to keep pace. End users rarely recognize this as a database issue; it usually manifests as painfully slow screen refreshes or a cursor stuck on busy.

The move to cloud computing, with its automatic horizontal scaling (adding more processors on demand), mitigated this somewhat. However, as data volumes grow very large, moving data between systems and between CPU enclosures becomes the rate-limiting step. The result is diminishing returns: doubling the compute you apply might yield, say, only 50% more speed.
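
As a rough illustration of why the returns turn non-linear, the sketch below models the data-movement step as a serial fraction under Amdahl's law. The model and the one-third serial fraction are assumptions chosen to reproduce the "doubling buys 50%" figure, not numbers from the article:

```python
# A minimal sketch (not from the article): modeling diminishing returns from
# horizontal scaling with Amdahl's law, treating data movement between nodes
# as the serial fraction that extra processors cannot speed up.

def speedup(n_processors: int, serial_fraction: float) -> float:
    """Amdahl's law: overall speedup on n processors when serial_fraction
    of the work (e.g., shuffling data between enclosures) stays serial."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

if __name__ == "__main__":
    # With an assumed serial fraction of 1/3, going from 1 to 2 processors
    # yields only ~50% more throughput, and returns keep shrinking after that.
    for n in (1, 2, 4, 8, 16):
        print(f"{n:>2} processors -> {speedup(n, serial_fraction=1/3):.2f}x")
```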

The implicit response of most companies in these circumstances is, in effect, to stop looking at all the data: aggregating hourly data to daily, say, or daily to monthly. Under normal operating conditions with well-understood data, this may be fine. But it carries real risk, because modern data science techniques require access to the primary, granular data in order to generate a fundamental type of insight: anomaly detection.
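
To make the risk concrete, here is a minimal sketch with synthetic data: a one-hour spike that a simple z-score detector flags in the hourly series vanishes entirely once the same series is rolled up to daily totals.

```python
# A minimal sketch with synthetic data: an anomaly visible at hourly
# granularity disappears after aggregation to daily totals.

import statistics

hourly = [100.0] * (24 * 7)   # a week of flat hourly readings
hourly[80] = 600.0            # one anomalous hour on the fourth day

def zscore_outliers(series, threshold=3.0):
    """Indices of points more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(series)
    stdev = statistics.stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mean) / stdev > threshold]

daily = [sum(hourly[d * 24:(d + 1) * 24]) for d in range(7)]

print(zscore_outliers(hourly))  # [80] -- the anomalous hour is flagged
print(zscore_outliers(daily))   # []   -- after aggregation, nothing stands out
```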

Anomalies can be good or bad, but they are rarely neutral. They represent your best and worst customers, and your company's best and worst responses. They encompass areas of high business risk as well as high reward. Working around a technological limitation by ignoring outliers is therefore a false economy.

A classic example is utilities, which until recently (and sometimes still) used 1 km resolution data to monitor tree health and wildfire risk. A single pixel in such a system can contain 1,000 healthy trees and one dead tree. But it only takes one dead tree striking a power line to cause a wildfire large enough to bankrupt a major utility.
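
To see how thoroughly a coarse pixel can hide that one dead tree, consider this back-of-the-envelope sketch; the per-tree health scores are invented for illustration:

```python
# A back-of-the-envelope sketch with hypothetical numbers: averaging a
# per-tree health score over a 1 km pixel with 1,000 healthy trees and
# one dead tree barely moves the pixel-level reading at all.

healthy_score, dead_score = 0.9, 0.1   # assumed per-tree health scores
pixel_mean = (1000 * healthy_score + dead_score) / 1001

print(f"all healthy: 0.9000, with one dead tree: {pixel_mean:.4f}")
# all healthy: 0.9000, with one dead tree: 0.8992
```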

The business risk in this case is hidden in decades-old data-collection decisions layered on even older database technology, but it is very real nonetheless. Today would be a very good time to start addressing it: data sources and methods have evolved rapidly over the past five years, and most have yet to exploit GPU analysis or the newly available hardware.
