Edge computing is the critical architecture for enabling most 5G use cases. However, the requirements and expectations of edge nodes in terms of computing resources are still in question, and with the growing number of devices and volume of data, meeting them could become cumbersome.
The L8istSh9y Podcast and Calsoft Inc. collaborated on a discussion about the value of NFV for Telecom and Edge Computing. The invited guests were Pavan Gupta and Kiran Divekar, Software Architects at Calsoft Inc. The podcast was hosted by Stephen Spector (Senior Director, Digital Marketing, EdgeGravity by Ericsson) and Rob Hirschfeld (CEO, RackN).
This infographic covers the basics of and need for NVMe: market drivers for NVMe, the ecosystem, and forecasts.
The HA feature plays an important role in a Disaster Recovery System (DRS). Multiple clusters located at geographically different sites can be interconnected. In case of physical damage, if one site goes down, another can take over responsibility for serving the data, and it hands the remaining work back once the failed site recovers.
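The failover behavior described above can be sketched in a few lines. This is a minimal conceptual model, not tied to any specific DRS product; the site names and health-check logic are illustrative assumptions.

```python
# Minimal sketch of active/standby failover between two DR sites.
# Site names and the health-check mechanism are illustrative assumptions.

class Site:
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def serve(self, request):
        if not self.healthy:
            raise RuntimeError(f"{self.name} is down")
        return f"{self.name} served {request}"

def route(request, primary, standby):
    """Send the request to the primary site; fail over to the standby if it is down."""
    try:
        return primary.serve(request)
    except RuntimeError:
        return standby.serve(request)

site_a = Site("site-A")
site_b = Site("site-B")

print(route("read:block-42", site_a, site_b))   # served by site-A
site_a.healthy = False                          # simulate physical damage at site A
print(route("read:block-42", site_a, site_b))   # site-B takes over
```

A real DRS would also replicate data between the sites and resynchronize state when the failed site comes back, which this sketch omits.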
This blog explains where the cloud-native approach in NFV has been this year and where it’s headed.
It is clear that most vendors are focused on utilizing public cloud or hybrid cloud environments to archive long-term data. Use of a hybrid cloud means that the private cloud can be used to store data that is bound by compliance and security norms critical to organizations.
This article discusses how Kubernetes has evolved from a container orchestration platform into a means of managing complex workloads in AI and machine learning stacks, managing containers in NFV architectures, and handling hardware GPU resources.
IoT offers some pretty interesting applications that make our lives easier in healthcare, transportation, and agriculture. IoT represents a new revolution of the internet and has the potential to change the world.
OpenDedupe is a feature-rich, open-source deduplication mechanism that supports local as well as cloud storage, and it can be scaled out very easily. OpenDedupe is a great fit for various data protection products, long-term retention, virtualization, and scale-out storage. It also works with SharePoint.
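The core idea behind any deduplication mechanism like OpenDedupe is to split data into chunks, hash each chunk, and store each unique chunk only once. Here is a minimal sketch of that idea; the tiny chunk size and in-memory store are illustrative assumptions (real systems use kilobyte-scale chunks and persistent stores).

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunk size for demonstration; real systems use KBs

def dedupe_store(data, store):
    """Split data into fixed-size chunks, store each unique chunk once,
    and return the list of chunk hashes (the 'recipe' to rebuild the data)."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:        # only chunks not seen before consume space
            store[digest] = chunk
        recipe.append(digest)
    return recipe

def restore(recipe, store):
    """Rebuild the original data from its recipe of chunk hashes."""
    return b"".join(store[d] for d in recipe)

store = {}
recipe = dedupe_store(b"ABCDABCDABCDXYZ!", store)
assert restore(recipe, store) == b"ABCDABCDABCDXYZ!"
print(len(store))  # 2 unique chunks stored for 16 bytes of input
```

The same chunk store can back many backups, which is why deduplication pays off most in long-term retention scenarios where successive backups share most of their data.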
Then DPDK (Data Plane Development Kit) came along. It said, "from today onwards there is going to be just one agency, itself, following just one rule book, the DPDK library," irrespective of the requestor's end goal. All packets needing network-function processing (again, meaning routing, switching, firewall allow/deny, etc.) now pass through this single library.
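The "one rule book" idea can be sketched as a single processing chain that every packet traverses, regardless of which application submitted it. This is a conceptual model only: the packet fields, firewall rule, and routing rule below are made-up assumptions, and real DPDK is a C library built around user-space poll-mode drivers, not this simplified dispatch.

```python
# Conceptual sketch of a single packet-processing pipeline: every packet
# goes through the same chain (firewall, then routing). The blocklist and
# routing rule are hypothetical examples, not real DPDK APIs.

FIREWALL_DENY = {"10.0.0.99"}   # hypothetical source-address blocklist

def firewall(packet):
    """Drop packets from denied sources; pass everything else through."""
    return None if packet["src"] in FIREWALL_DENY else packet

def process(packet):
    """The 'one rule book': firewall first, then a routing decision."""
    packet = firewall(packet)
    if packet is None:
        return "dropped"
    # Hypothetical routing rule: local 10.0.0.x traffic out eth1, rest out eth0.
    packet["out_port"] = "eth1" if packet["dst"].startswith("10.0.0.") else "eth0"
    return f"forwarded via {packet['out_port']}"

print(process({"src": "10.0.0.5", "dst": "10.0.0.7"}))   # forwarded via eth1
print(process({"src": "10.0.0.99", "dst": "8.8.8.8"}))   # dropped
```

The point of the sketch is the uniformity: routing, switching, and firewalling are stages in one pipeline rather than separate kernel subsystems with their own entry points.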