AT&T Edge Cloud (AEC) - White Paper
AT&T Labs & AT&T Foundry
© 2017 AT&T. All rights reserved.

Table of Contents
I. Executive Summary
II. Background: Cloud Computing Evolution and the Rise of the Edge Cloud
III. AT&T Edge Cloud Architecture
IV. Edge Computing Drivers
V. Edge Location Tradeoffs
VI. Edge Computing Use Cases
VII. Co-existence of Centralized Cloud and Edge Compute
VIII. Edge Computing Key Requirements
IX. Virtualization Infrastructure Manager (VIM)
X. Scaling Edge Functions Using Cloud Native Computing
XI. Orchestration & Management
XII. Open Source Eco-system Edge Computing Enablers
XIII. Conclusion
XIV. Glossary
XV. References

Executive Summary

In recent years, there has been a concerted effort among companies of all kinds to move their infrastructure to a centralized cloud, enabled by virtualization. This push started with the vision of reducing time to market for new services and achieving a lower total cost of ownership (TCO). The surge in demand for cloud computing led providers like Amazon and Google to build massive centralized clouds (think data centers) designed for efficiency.

With the emergence of new technologies such as augmented and virtual reality, autonomous cars, drones, and IoT with smart cities, data is increasingly being produced at the user end of the network. These use cases demand real-time processing and communication between distributed endpoints, creating the need for efficient processing at the network edge. "Edge computing" is the placement of processing and storage capabilities near the perimeter (i.e., "edge") of a provider's network.
Edge computing can be contrasted with the highly centralized computing resources of cloud service providers and web companies. Edge computing brings multiple benefits to telecommunications companies:

- reducing backhaul traffic by keeping the right content at the edge,
- maintaining Quality of Experience (QoE) for subscribers with edge processing,
- reducing TCO by decomposing and disaggregating access functions,
- reducing cost by optimizing the current infrastructure hosted in central offices with low-cost edge solutions,
- improving the reliability of the network by distributing content between edge and centralized data centers,
- creating an opportunity for 3rd party cloud providers to host their edge clouds on telco real estate.

The computational resources can be distributed geographically across a variety of location types (e.g., central offices, public buildings, customer premises, etc.) depending on the use case requirements. This variety requires flexibility in the hardware and software design to accommodate constraints around power, space, security, and other elements. Therefore, edge computing needs to support any type of compute, storage, and network (e.g., peripherals, uCPE, COTS, white box, etc.).

AT&T's Domain 2.0 (D2) initiative has always followed a hybrid cloud deployment model to keep latency-sensitive network and service functions closer to the edges of the network and move non-"real-time" applications to centralized data centers. This type of planning and deployment method is well known and has been adopted by most global telco operators. This model can also be adapted to edge computing and edge-friendly workloads. We will more than double our D2 AIC compute capacity in 2017. These new edge applications need to be geographically close to users, on a typical scale of hundreds to multiple thousands of edge cloud deployments.
At that scale, cost-effectiveness is key, along with native support for acceleration and peripherals.

There are multiple edge open source and standards initiatives (e.g., ONAP i, OpenStack ii, ONF vii, CNCF iii, ETSI MEC iv, OPNFV viii, Open Compute Project ix, xRAN x, 3GPP xi, etc.) that are converging to create an ecosystem that will support edge computing and edge services. The purpose of this whitepaper is to articulate the business and technical benefits of edge computing, describe edge technologies and solutions for suppliers, and identify emerging potential business opportunities with 3rd party cloud providers. Further, we establish a consortium proposal for global commonality such that operators, open source communities, and standards bodies can realize next generation applications such as augmented and virtual reality (AR/VR), self-driving vehicles, IoT, and more.

Background: Cloud Computing Evolution and the Rise of the Edge Cloud

Cloud resources have democratized access to computing and dramatically increased the pace of disruptive innovation

Clayton Christensen's concept of the innovator's dilemma has been an axiomatic principle throughout the rise of the enterprise and consumer software markets.i In his 1997 book, Christensen describes the process by which a novel product or service can initially take root in an underserved fringe market, but then eventually grow to out-perform and disrupt the incumbent players in established markets as well (Figure 1).
The incumbents, especially larger, less agile enterprises, are thus faced with the dilemma of allocating sufficient resources to maintain their competitive edge against potential disruptors while simultaneously increasing performance along the dimensions that their mainstream customers have historically valued.ii Though applicable to almost any industry, this concept has been incredibly relevant in the rapidly evolving software and technology space.i

Over the last decade, public cloud providers have dramatically increased both the pace and the impact of disruptive innovation by allowing third-party services to reap the benefits of geographic presence and economies of scale without deploying and maintaining their own costly infrastructure. Public clouds have contributed to the success of new business models, facilitating the disruption of the television and movie industry, the disruption of the hotel industry, etc.,iii and are rapidly attracting established companies as well.iv

Figure 1: Impact of disruptive innovation upon new and existing markets iii

However, the emergence of the public cloud has been revolutionary beyond just the startup and software communities; it has also introduced a new paradigm by which large-scale enterprises can thrive in an increasingly fast-paced, software-driven economy.
Public cloud companies have leveraged the sizeable investments made toward their own computing infrastructure to create pay-as-you-go service environments for third-party players.v By powering some of the most innovative companies in the world with their Infrastructure as a Service (IaaS) model, these cloud providers are becoming important participants in any new technology trend and effectively sharing in the financial growth and success of each of their customers.i And while incumbents in the software and service space are increasingly disrupted, cloud infrastructure players have maintained a steady presence.v Thus, a large enterprise can do more than merely avoid being disrupted by new technologies. By leveraging its infrastructure capabilities as a service to enable the growth of emerging ecosystems, it can in fact share in their success.

The public cloud was itself a disruptive initiative. When the first major cloud provider initially launched, its service significantly underperformed the private data centers that were the gold standard for established enterprises.iv However, rather than competing for these mainstream customers, it gained traction by designing toward underserved markets, identifying new customers and use cases that were ill-suited for existing infrastructure paradigms. Over time, the offering adapted and grew dramatically, eventually leading even large IT companies to relinquish their private data centers.iv We believe AT&T must also design toward new, underserved use cases as we evolve our business models and infrastructure into the 5G era.

The IT-Networking Convergence Has Unlocked the Power of the Cloud

As technology has evolved and converged, the flow of information has become streamlined, driving compute and storage off personal devices while simultaneously enabling them to become more powerful and intuitive with each subsequent generation.
However, computing revolutions are not a byproduct of the device technology alone; the entire ecosystem and infrastructure must be developed to support these seismic shifts (Figure 2).

In the 1990s, telecommunications companies (telcos), which previously offered primarily dedicated point-to-point data circuits, began to offer Virtual Private Network (VPN) services. Rather than building out physical infrastructure to allow more users to have their own connections, telcos were now able to provide users with shared access to the same material resources. Because these operators could optimize resource utilization toward overall bandwidth usage efficiency, they could offer the same quality of service at a fraction of the cost. In these early stages, the term "cloud" was used to represent the computing space between the provider and the end user. In 1997, Professor Ramnath Chellappa of Emory University defined cloud computing as the new "paradigm where the boundaries of computing are determined by economic rationale rather than technical limits alone." This has become the basis of what we refer to today when we discuss the concept of cloud resources.vi

Figure 2: History and future projections of the IT-Networking convergence

As computers became ubiquitous, engineers and technologists explored ways to make large-scale cloud computing power available to more users through time-sharing. They experimented with algorithms to optimize the infrastructure, platforms, and applications to prioritize computing resources and increase efficiency for end users. In the 2000s, major movers in the cloud arena pioneered the concept of delivering enterprise-level applications to end users via the Internet.vii Over time, the IT-Networking ecosystem has become increasingly adept and dynamic.
Cloud data centers have become more powerful and geographically distributed, and networks have become dramatically more efficient in order to support them. As a result, remote application servers have become easier to access, bringing more information, media content, utility applications, and realistic forms of communication directly to the consumer.

[Figure 2, referenced above, charts the IT-Networking convergence across the 1990s, 2000s, 2010s, and 2020s: the dominant remote application server (enterprise servers through to the edge cloud), the device-cloud network interface (dial-up, DSL, 4G LTE + WLAN, 5G + WLAN), and the dominant personal computing paradigm (PC, laptop, smartphone, head-mounted display), with the apparent device-cloud distance decreased by improved network efficiency (virtual private networks, centralized cloud + CDNs).]

Content Delivery Networks Brought Static Content to the Edge

Content Delivery Networks (CDNs) play a critical role in the seemingly immediate content access that users experience today by establishing Points of Presence (POPs) that store localized caches of content geographically nearer to end users. By hosting these large content files at the edge of the network, CDNs are also able to alleviate performance variance, core network congestion, and high operating expenses for massive data transmission.

Due to the increased streaming demands of the past few years, large content providers have explored models to further improve performance and reduce delivery cost. By establishing commercial agreements directly with local and regional network operators, they can place their own CDNs within the operator network itself, eliminating the delays induced by transferring to an external InP.
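The POP caching behavior described above, serve locally on a hit, pull from the origin on a miss, can be sketched as a simple LRU cache. This is an illustrative minimal model, not AT&T's or any CDN's actual implementation; the class and function names are hypothetical.

```python
from collections import OrderedDict

class EdgePop:
    """Minimal sketch of a CDN Point of Presence: an LRU cache that
    serves content from the edge when possible and otherwise pulls it
    from the central origin server. Illustrative only."""

    def __init__(self, capacity, origin):
        self.capacity = capacity    # max number of cached objects at this POP
        self.origin = origin        # callable: url -> content bytes
        self.cache = OrderedDict()  # url -> content, maintained in LRU order

    def get(self, url):
        if url in self.cache:               # cache hit: serve from the edge
            self.cache.move_to_end(url)
            return self.cache[url], "edge"
        content = self.origin(url)          # cache miss: fetch from origin
        self.cache[url] = content
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least-recently-used item
        return content, "origin"

# A second request for the same object is served locally at the edge.
pop = EdgePop(capacity=2, origin=lambda url: f"<bytes of {url}>")
print(pop.get("/movie/seg1.ts")[1])  # origin
print(pop.get("/movie/seg1.ts")[1])  # edge
```

Real POPs layer on freshness checks, origin shielding, and request routing, but the hit/miss/evict cycle above is the core mechanism that keeps large content files near end users.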
These large CDN providers are continuously expanding their geographic presence.iii

The Emergence of the Compute-Driven Edge Cloud

While CDNs significantly improve user access to static, predefined content such as videos and web pages, they are not designed to be application servers for dynamic content such as AR/VR, real-time analytics, etc. CDNs essentially operate as caches for content files that they pull from their central origin servervii and are not equipped with the requisite computing infrastructure to dynamically generate content streams.viii Pseudo-dynamic (sometimes referred to as event-driven) content can be distributed with CDN architecture by breaking streams into small segments and caching multiple possible versions of each segment in event-driven lookup tables.ix For example, adaptive bitrate streaming is accomplished by encoding several versions of the same video file at a variety of data rates. The CDN can then adjust to network fluctuations by delivering appropriate stream segments based on the inp
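The adaptive bitrate mechanism described above can be sketched as a selection over a ladder of pre-encoded renditions: pick the highest bitrate that fits within the measured throughput. The specific bitrate ladder and the 0.8 safety margin are illustrative assumptions, not values from this paper.

```python
def pick_variant(measured_kbps, variants_kbps, safety=0.8):
    """Pick the highest pre-encoded bitrate that fits within a safety
    margin of the measured throughput; fall back to the lowest rendition.
    The safety factor (0.8, illustrative) leaves headroom for network
    fluctuations between segment downloads."""
    affordable = [v for v in sorted(variants_kbps)
                  if v <= measured_kbps * safety]
    return affordable[-1] if affordable else min(variants_kbps)

# Hypothetical ladder of renditions cached at the CDN edge, in kbps.
ladder = [300, 750, 1500, 3000, 6000]
print(pick_variant(5000, ladder))  # 3000: 6000 kbps exceeds 5000 * 0.8
print(pick_variant(200, ladder))   # 300: lowest rendition as the floor
```

Because every rendition is just another static file, this logic is all a CDN needs to appear "dynamic" for streaming, which is precisely why truly computed content such as AR/VR scenes falls outside what caching alone can deliver.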