FAQs: Hybrid edgeCloud Platform

What is edge computing?

Edge computing is a distributed information technology architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. It pushes the computational activity of applications, data and services away from a cluster of centralized server-side nodes.

What is the difference between edge and cloud computing?

Edge computing refers to data processing power at the edge of a network instead of holding that processing power in a cloud or a central data warehouse. The name “edge” in edge computing is derived from network diagrams; typically, the edge in a network diagram signifies the point at which traffic enters or exits the network.

What is the major advantage of the central cloud?

The major advantage of the central cloud architecture is the rapid and low-cost deployment of computing and/or storage-intensive applications on generic servers shared amongst many applications with the aid of virtualization and orchestration technologies.

What are the disadvantages of the central cloud?

The disadvantages of the central cloud are as follows:

  1. The current hierarchical architecture makes central cloud resources and network connectivity the bottleneck for future growth. Sending data from hundreds of billions of client devices to a large number of centralized cloud servers wastes bandwidth and energy.
  2. Another disadvantage of central cloud architecture is the developer's reliance on cloud service providers that have access to the apps and the data stored or processed in their servers. As a result, today, a handful of very large companies have control over the vast majority of consumer and enterprise data.
  3. Despite all the sophisticated security measures, storing data and hosting applications on third-party resources puts the owners of the information at risk of exposure. Cloud resources have been designed for easy access by millions of developers and application service providers, which in turn has increased vulnerabilities and security holes.

What are the advantages of hybrid edgeCloud?

According to the fundamental principles of Hybrid edgeCloud architecture, much of the computational processing is performed at the edge; communication is kept as local as possible. Also, edge nodes can collaborate and share computing and other resources in close proximity.

The benefits of such an architecture are as follows:

  1. reduced cloud hosting costs
  2. reduced communication bandwidth and improved network efficiency
  3. reduced energy consumption and carbon emissions
  4. reduced latency
  5. reduced application development time
  6. promotion of microservices as the basic deployment unit for computation resources
  7. increased data privacy
  8. better control over their data for consumers and enterprises
  9. better scalability
  10. minimized transport costs and latencies for applications

Is the Hybrid edgeCloud antithetical to using a central cloud?

No.

The “central cloud”, defined as servers in data centers, remains a valuable resource in the modern enterprise. The central cloud can be indispensable for many applications that require central storage or processing. Data center resources supporting a central cloud may need to increase capacity, but at a reasonable pace to accommodate the needs of central processing only. All the other possible tasks and functions should be delegated to edge nodes, where most of the data is generated today.

When a Hybrid edgeCloud is in force, servers in data centers will no longer be a bottleneck or the “always necessary” trust elements in an enterprise architecture. As a result, resources in the central cloud do not need to grow in proportion with edge nodes but only in proportion to the needs of central processing as dictated by the relevant use cases and associated applications.

What is distributed edge computing architecture?

A distributed edge computing architecture means that network-aware devices form a collection of independent nodes that do not need the central cloud in order to operate. These independent nodes can process data independently and communicate with each other directly, sharing resources and collaborating in a dynamic hierarchy.

A distributed edge computing architecture differs from today’s cloud computing ecosystem. In a centralized cloud computing architecture all nodes are connected to the cloud. Therefore, most nodes transmit raw data back to the cloud and communicate through the cloud in a fixed hierarchy that does not enable nodes to share resources and collaborate independently.
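
To make the contrast concrete, here is a minimal sketch, assuming two nodes on the same LAN with a hypothetical address, port, and payload: one edge node exposes its data over a local HTTP endpoint and a peer reads it directly, with no central cloud in the path.

```typescript
// Illustrative sketch only: two edge nodes on the same LAN exchanging data
// directly, with no central cloud in the path. The address, port, and
// payload are hypothetical; a real deployment would discover peers
// dynamically instead of hard-coding them. In practice the two nodes would
// run on separate devices; they are combined here for brevity.
import { createServer } from "http";

// Node A exposes its latest sensor reading on a local endpoint.
createServer((req, res) => {
  if (req.url === "/reading") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ node: "A", temperature: 21.5 }));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080, () => console.log("Node A serving on :8080"));

// Node B reads the value directly from Node A's LAN address, rather than
// uploading raw data to the cloud and fetching it back down.
async function pollPeer(): Promise<void> {
  const res = await fetch("http://192.168.1.20:8080/reading");
  console.log("direct peer reading:", await res.json());
}
pollPeer().catch(console.error);
```

In a centralized architecture, both the write and the read in this sketch would instead traverse the wide-area network to a cloud backend.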

What is the difference between edge and fog computing?

Fog computing’s processing efforts are focused at the local area network end of the chain whereas edge computing pushes these efforts closer to the data source. Hence, each device on the network plays its own role in processing the information, instead of using a centralized server for the majority of processing.

What are the main advantages of edge computing?

The main advantages of edge computing are as follows:

  • Real-time or near real-time data analysis as the data is analyzed at the local device level, not in a distant data center or cloud far from the point where the data is created.
  • Lower operating costs due to the smaller operational and data management expenses of local devices versus clouds and data centers.
  • Reduced network traffic because less data is transmitted from local devices via a network to a data center or cloud, thereby reducing network traffic bottlenecks.
  • Improved application performance, as apps that cannot tolerate latency can achieve lower latency levels on the edge, as opposed to a faraway cloud or data center.

What are some of the applications of edge computing?

Edge computing is ready for deployment now in almost all industries. For example, fitness centres can take advantage of edge computing to enhance their members’ experience by turning their fitness equipment into edge devices. This makes it possible to connect a piece of equipment with the user’s mobile device and wearables to monitor the user’s fitness progress and engage with them at the right time.

Self-driving cars are another example where edge computing is useful. These cars generate about 1 GB of data per second; at that rate, a single car produces more than 3.5 TB per hour, so it is not feasible to send all of this data back to the cloud continuously. Edge computing has the potential to transform cars into data centers on wheels, where most of the processing is performed locally. Edge-enabled cars can communicate on a peer-to-peer basis, reducing bandwidth consumption and latency. Imagine two self-driving cars about to crash: they need to make decisions quickly, and the round-trip latency of a central cloud system makes that infeasible. Edge computing supports the instantaneous processing of information within the vehicle, enabling cars to rapidly decide when to brake, speed up, or change direction, as well as communicate directly with all the cars nearby.

Moreover, with edge computing, you can network all devices inside a car. Passengers can simultaneously connect multiple devices to the car’s infotainment unit, and create a cross-device jukebox that can easily and directly share content. Edge computing even allows taxi drivers to securely offer internet connection to passengers.

A third example of using edge computing is in our homes. Edge computing can turn a set-top box (STB) into an edge cloud server. The benefits are enormous: better cross-screen media sharing than is available with the current AirPlay or Chromecast systems, the ability to deploy and launch services such as smart home applications quickly and reliably, and the potential to group STBs and share resources.

In addition to the examples described above, there are many more use cases: connecting all electronic gadgets and appliances directly, allowing device manufacturers to harness the collective power of the devices they have deployed, enabling new features in social media applications, connecting drones, and turning devices such as mobile phones into sensor hubs used in agriculture and mining to collect and process data and even share resources across devices.

How does the edgeEngine work?

The mimik edgeEngine is a collection of mimik software libraries and corresponding APIs. Developers can use them to efficiently solve the fundamental challenge of networking nodes in the new hyper-connected and highly mobile world of distributed edge computing. edgeEngine can run on any mobile device, fixed gateway, autonomous car gateway, connected TV or even in the cloud, depending on the application use case.

edgeEngine runs in a heterogeneous environment, regardless of OS, manufacturer, and connected network, which is a non-trivial challenge. Once edgeEngine is downloaded onto a device, that device becomes an edge cloud node.

mimik edgeEngine resides between the operating system and the end-user application. Several microservices are available from mimik, and the edgeEngine SDK gives third parties the ability to develop their own microservices. The runtime environment for microservices is also provided by mimik edgeEngine.

edgeEngine transforms computing devices into intelligent network nodes that can then be formed into clusters.

mimik edgeEngine takes away the complexity of networking among distributed edge cloud nodes, enabling developers to focus on their solution in a microservice model even on small mobile devices.
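
To give a feel for the kind of plumbing this abstracts away, the sketch below shows one conventional way nodes could announce themselves and be discovered on a local network using UDP broadcast. It is purely illustrative and does not represent edgeEngine's actual mechanism or APIs; the port, message format, and announcement interval are assumptions.

```typescript
// Conceptual illustration of LAN peer discovery via UDP broadcast.
// This is NOT the edgeEngine mechanism or API; it only sketches the kind
// of low-level networking the platform handles for you. The port, message
// format, and announcement interval are hypothetical.
import dgram from "node:dgram";

const DISCOVERY_PORT = 41234;
const socket = dgram.createSocket({ type: "udp4", reuseAddr: true });

socket.on("message", (msg, rinfo) => {
  // Every node announcing itself on the LAN shows up here.
  console.log(`discovered node "${msg.toString()}" at ${rinfo.address}`);
});

socket.bind(DISCOVERY_PORT, () => {
  socket.setBroadcast(true);
  // Periodically announce this node so peers can discover it too.
  setInterval(() => {
    socket.send(Buffer.from("example-node-1"), DISCOVERY_PORT, "255.255.255.255");
  }, 5000);
});
```

With edgeEngine, discovery, clustering, and connectivity of this kind are handled by the platform, so application code works with the resulting nodes and microservices rather than raw sockets.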

The mimik edgeEngine provides native class wrappers (or API wrappers) for all supported platforms, in the interest of shortening the learning curve and accelerating development. These are available for:

  • Android, developed in Java
  • iOS, developed in Objective-C
  • Linux, Windows or Mac OS X, developed in C++

Why should developers integrate mimik edgeEngine?

In a nutshell, mimik edgeEngine enables developers to rapidly build exciting new applications by turning computing devices into edge cloud servers.

edgeEngine provides Service Discovery, Connection, and Communication between nodes at both the physical and microservice level, and the benefits are countless:

  • Automatic device discovery with no need for extra signalling or control.
  • Ability to create a micro-cloud cluster network of edge devices via local WiFi without internet.
  • Node-to-node file sharing and beaming (casting) of content from within any app (a minimal sketch of this idea follows the list).
  • Advanced peer-to-peer networking without the hassle of low-level network setup or programming.
  • Microservice runtime environment for many platforms including mobile devices.
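
As a rough illustration of the node-to-node sharing idea mentioned above, the sketch below shows a node serving a local media file over HTTP so that a nearby device can stream it directly. It does not use edgeEngine APIs; the port, route, and file path are hypothetical.

```typescript
// Minimal sketch of node-to-node content sharing: one node serves a local
// media file over HTTP so a nearby device can stream it directly. This does
// not use edgeEngine APIs; the port, route, and file path are hypothetical.
import { createServer } from "http";
import { createReadStream, existsSync } from "fs";

const PORT = 4000;
const SHARED_FILE = "./shared/video.mp4"; // hypothetical local content

createServer((req, res) => {
  if (req.method === "GET" && req.url === "/content" && existsSync(SHARED_FILE)) {
    res.writeHead(200, { "Content-Type": "video/mp4" });
    createReadStream(SHARED_FILE).pipe(res); // stream straight off the device
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(PORT, () => console.log(`sharing node listening on :${PORT}`));
```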

What differentiates mimik from similar companies?

mimik is a distributed edge cloud software platform that turns heterogeneous computing devices into independent edge cloud servers. To that end, mimik has developed and launched the first-ever cross-platform SDK for a heterogeneous edge cloud.

The essential difference compared to companies trying to offer similar services is that those offerings have several limitations: they are limited to specific operating systems such as Android or iOS, require particular devices as edge servers (for example, only PCs and not smartphones), offer only media sharing, lack a microservice runtime environment or support for one, and most do not provide an SDK for developers.

What is the difference between mimik SDK and Amazon Web Services (AWS) Greengrass?

The main difference is that AWS Greengrass nodes do not discover other nodes at the edge. Both Azure IoT and Greengrass need manual configuration. For example, for each node to become part of a Greengrass group, it has to be loaded with the Greengrass IoT client software manually. Since there is no app store or application bundle to download and install easily, the node has to be registered manually with the AWS backend and authorized via credentials generated through the AWS console, which must also be saved securely on the IoT device. All of this has to be done before the node can connect to its local gateway, which is itself a manually configured device. mimik edgeEngine does all of the above automatically.

Azure or AWS Greengrass IoT are perfect candidates for an IoT microservice hosted within a mimik Edge node. Using Azure or AWS Greengrass IoT enables connecting “IoT islands” to mimik Edge clouds (clusters) based on scope, enabling the formation of application and use-case specific IoT network overlays on top of a physical, disjointed network.
