Why Do We Provide Edge Computing Devices?

In September 2021, AI inside launched “AI inside Cube Pro,” the highest-performance model in our series of in-house designed edge computing devices, “AI inside Cube.”

For AI inside to realize a society in which AI reaches every corner of daily life, “AI inside Cube” is an essential component of a platform that lets anyone create and use AI easily. In this article, we explain why, as a company that provides AI services, we design and provide in-house developed edge computing devices in addition to software, and describe the future we are aiming for.

“Edge AI” is gaining attention

There are two primary environments in which to run AI: cloud computing and edge computing. In cloud computing, AI training and inference are processed in the cloud; running AI this way is called “cloud AI.” In edge computing, by contrast, processing happens near the device (the edge); running AI training and inference there is called “edge AI.”

The cloud has been the mainstream environment for AI training because of the need to process large amounts of data. Recently, however, the total addressable market for edge AI computing in Japan has been forecast to reach 56.45 billion yen by FY2025 (4.2 times the current size), and opportunities for its use are expected to expand. * Source: Fuji Chimera Research Institute, Inc., “2020 Artificial Intelligence Business Survey.”

One reason for the market’s expansion is that even companies and organizations that cannot provide data to outside parties or upload it to the cloud, because they handle highly confidential data, can still utilize that data with edge AI. Another advantage of edge AI is that it eliminates the wait for a response after sending data to the cloud.

The need for in-house development for business innovation

When a company develops and implements AI, in most cases it provides the training data to an AI vendor or development company, outsources the development of the AI models, and conducts a proof of concept (PoC) based on the developed AI. Most companies follow this flow because AI training requires high-spec machines, and most users have neither the equipment nor the skills.

In recent years, however, companies seeking to transform their businesses and create more value have needed to promote DX (digital transformation) with speed and rationality, using ICT technologies including AI. If outsourcing AI development to external companies remains the standard, they will face challenges in both the speed and the volume of their response. To cope with these issues, there is a growing need for companies to build in-house systems capable of AI development and operation.

An environment for anyone to easily create and operate AI

“AI inside Cube Pro,” announced in September, is the highest-performance edge computing device in the series. It allows AI training and inference to be completed entirely in-house, in contrast to AI development that has mainly been done in the cloud.

In addition, at the same time as the announcement of “AI inside Cube Pro,” we began providing “Learning Center,” an AI development service for creating highly accurate AI without code, in an on-premise environment.

By using “Learning Center” and “AI inside Cube Pro” together, customers gain both a method to create AI in-house and an environment to run it in-house, making it possible to internalize AI development and operation.

The most important feature of “Learning Center” is that even front-line staff with no programming knowledge can develop AI, verify its effect, and conduct additional training (updates) on their own, according to the issues and insights of the field.

The typical AI development steps and the steps for “Learning Center”

The future AI inside aims for: a society where AI is born in a distributed manner

By designing and providing our own edge computing devices, AI inside is promoting a new standard in which anyone can have an environment to train AI, regardless of their infrastructure. Furthermore, by making AI training and inference possible without having to be conscious of the limitations and characteristics of cloud and edge computing, it becomes possible to utilize data that could not be used in the past, increasing the possibilities for new AI applications.

AI inside believes that by providing an environment where “anyone can easily train AI,” many people will be able to express their creativity, and a wide variety of AI will be created in a distributed manner. AI inside will continue to provide AI platforms and services with excellent UI/UX.

Contact for in-house development of AI:




This is the official account of AI inside Inc.
