Doesn’t it feel good when something you envision actually comes to life?

A couple of years ago I wrote about how a group of high school students near Dell’s headquarters in Texas took on the challenge of designing the library of the future. At the time, I noted how happy the students were to hear that their hard work – completely volunteer, with no grade attached – was being taken seriously by business, community and school leaders.

So seriously, in fact, that it has now become a reality.

“The first time we walked in it looked more real than anything, like, this is what I designed. This is what came to life,” one of the students remarked.

“The plan was, do not give them any constraints. Let their minds wander, let their creativity show,” said Career and Technology Academy Specialist Rachel Sotelo at Cedar Ridge High School.

That same spirit of freedom and student decision-making extends to the Dell technology the school uses. From laptops to desktops, Dell Chromebooks to interactive projectors, students are encouraged to choose whichever tool meets their needs.

“Instead of assigning one device per student, we ask the children to think about their learning experience – what’s the purpose here, and which devices will best support me,” explained Gabi Nino, principal of Joe Lee Johnson Elementary School.

Our Dell Education team believes that every student should have a voice and a choice in the learning process. They spend a lot of time in classrooms, talking with students and educators to design solutions that support them as they redefine the learning environment.

To learn how they can help personalize learning, check out more case studies, white papers, brochures and webcasts in our online library.
In a striking example of the power of distributed, yet collaborative, analytics in-place on a worldwide scale, Patricia Florissi, PhD, Dell EMC, begins with an example from the healthcare industry and then takes us through a detailed discussion of the World Wide Herd (WWH), a global virtual computing cluster. To understand the concepts and the possibilities, read on in “Distributed analytics meets distributed data,” below.

Distributed analytics meets distributed data

To illustrate the power of the concept of distributed, yet collaborative, analytics in-place at worldwide scale, it sometimes helps to begin with an example. In this case, I will start with an example from the healthcare industry, and then dive into a discussion of the World Wide Herd (WWH), a global virtual computing cluster.

Hospitals around the world are moving to value-based healthcare and achieving dramatic reductions in costs. One way to achieve these goals is to make more effective and efficient use of expensive medical diagnostic equipment, such as CT scanners and MRI machines. When a hospital maximizes its utilization of these devices, it increases its ROI and potentially reduces its costs by avoiding the need to buy additional devices. In principle, it is contributing to more affordable care.

With a focus on value-based healthcare, Siemens Healthineers, the healthcare business of Siemens AG, is developing a global benchmarking analytics program that will allow its customers to see and compare their device utilization metrics against those of hospitals around the world. The goal is to help hospitals identify opportunities to gain greater value from their investments.

This global benchmarking analytics program will be offered via the Siemens Healthineers Digital Ecosystem, a digital platform for healthcare providers, as well as for providers of solutions and services, aimed at covering the entire spectrum of healthcare.
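To make the utilization metric concrete, here is a minimal sketch of the kind of calculation a hospital might feed into such a benchmark. This is purely illustrative, not Siemens’ actual method; the figures and the 12-hour operating window are hypothetical.

```python
# Illustrative sketch (not Siemens' actual analytics): a simple device
# utilization metric that could feed a benchmarking comparison.

def utilization(scan_minutes, available_minutes):
    """Fraction of available time a scanner spent performing scans."""
    return scan_minutes / available_minutes

# Hypothetical week of CT scanner activity: minutes spent in active scans
# per day, against a 12-hour daily operating window.
daily_scan_minutes = [410, 385, 440, 300, 395]
window = 12 * 60  # minutes per operating day

rate = utilization(sum(daily_scan_minutes), window * len(daily_scan_minutes))
print(f"weekly utilization: {rate:.1%}")
```

A benchmarking program would compare many such rates across hospitals, which is where the distributed analysis described below comes in.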
The Siemens Healthineers Digital Ecosystem platform, announced in February 2017, will foster the growth of a digital ecosystem linking healthcare providers and solution providers with one another, as well as bringing together their data, applications and services.

Global benchmarking analytics in the Siemens Healthineers Digital Ecosystem will be powered by the innovative Dell EMC World Wide Herd technologies, enabling the Internet of Medical Things (IoMT) for several healthcare modalities. Dell EMC’s collaboration with Siemens delivers the ability to analyze data at the edge, where only the analytics logic itself and aggregated intermediate results traverse geographic boundaries, facilitating data analysis across multi-cloud environments without violating privacy and other governance, risk and compliance constraints.

How it works

The WWH concept, pioneered by Dell EMC, creates a global network of Apache™ Hadoop® instances that function as a single virtual computing cluster. The WWH orchestrates the execution of distributed and parallel computations on a global scale, across clouds, pushing analytics to where the data resides. This approach enables analysis of geographically dispersed data without requiring the data to be moved to a single location before analysis. Only the privacy-preserving results of the analysis are shared.

Let’s take a closer look at how the WWH enables distributed, yet collaborative, analytics at a global scale.

First, WWH distributes computation across a virtual computing cluster and pushes analytics to its virtual computing nodes. In the case of Siemens, each virtual computing node is implemented by a cloud instance that collects and stores data from Siemens’ medical devices in local hospitals and medical centers.

Second, computation takes place, in real time, where the data resides.

Third, only the privacy-preserving results are sent back to the initiating location, where they are aggregated, and a global analysis is performed on these results.
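The three steps above can be sketched in miniature. This is an illustrative simulation of the pattern, not Dell EMC’s WWH code: each “node” computes a local histogram over its own records, and only those histograms (never the raw data) travel back to the initiating node for aggregation.

```python
from collections import Counter

def local_analytics(records):
    """Steps 1-2: runs at each node; buckets raw utilization readings locally."""
    def bucket(pct):
        low = 10 * (pct // 10)
        return f"{low}-{low + 9}%"
    return Counter(bucket(r) for r in records)

def global_analysis(local_histograms):
    """Step 3: the initiating node merges the privacy-preserving histograms."""
    total = Counter()
    for h in local_histograms:
        total += h
    return total

# Hypothetical per-hospital utilization percentages; raw readings never
# leave their node -- only the bucketed counts are transmitted.
hospital_a = [72, 75, 68, 81]
hospital_b = [55, 59, 77]

merged = global_analysis([local_analytics(hospital_a), local_analytics(hospital_b)])
print(dict(merged))
```

The merged histogram reveals the global distribution without exposing any individual reading, which is the essence of the privacy-preserving aggregation described above.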
In the case of Siemens, each virtual computing node calculates a local histogram and sends it back to the initiating node, which combines all the histograms to provide global benchmarking. A hospital administrator looking at the global histogram can immediately see how that one hospital performs compared to all the other hospitals in the world.

A WWH can have multiple configurations. The virtual computing nodes can be clouds in a multi-cloud environment, or Internet of Things (IoT) gateways in a multi-gateway environment, where analytics is pushed directly to the gateways themselves.

In its ability to pair distributed processing and analytics with distributed data, the WWH overcomes several pressing IT issues. It helps organizations address the challenges of:

- An explosion in the number of connected devices and the volume of IoT data, which defies the scalability of centralized approaches that store and analyze data in a single location
- Bandwidth and cost constraints that make it impractical to move data to central repositories
- Security concerns for data in transit
- Regulatory compliance issues that limit the movement of data beyond certain geographic boundaries

The bigger picture

When you study these and other challenges, you see that we are in the middle of a perfect storm that is disrupting the status quo. Increasingly, we need to take the processing power and analytics to the data, rather than vice versa. This is very much the future for many industries as we look to a world that is projected to have 200 billion connected devices in 2031. Data will increasingly be inherently distributed and inherently federated, with limited data movement.

While the example I have used here focuses on a specific use case in the healthcare industry, the WWH concept can be applied across a wide spectrum of industries.
In an earlier blog post, I explored the potential to use a WWH to advance disease discovery and treatment by enabling global-scale collaborative genomic analysis research. And, of course, WWH approaches can and will be used to help companies gain value from data spread across the IoMT and the IoT in general.

At the end of the day, rich insights can be obtained when the domain of the data analyzed transcends geographical, political and organizational boundaries, and the data can be analyzed as one virtual, cohesive dataset. That’s the World Wide Herd in action.

For a closer look at the Siemens Healthineers Digital Ecosystem and its many partners, visit com/healthineers-digital-ecosystem.

To explore Dell EMC solutions for data analytics challenges, visit com/BigData.

Patricia Florissi, Ph.D., is vice president and global CTO for sales and a distinguished engineer for Dell EMC.
Manageability

DD MC on KVM: DD MC can now integrate with your existing LDAP systems, making DD MC easier to use day to day. Our customers also wanted DD MC on open source software, and with Data Domain OS 6.2 they can now install and run DD MC on KVM. With the expansion of the cloud ecosystem to support Alibaba Cloud and GCP, DD MC now also supports these additional cloud providers, giving you broader visibility.

Security

Secure LDAP for User Access: LDAP can now be used to authenticate users who manage a Data Domain OS system. In previous versions of Data Domain OS, LDAP was used to set up NFS; this support now extends to user access.

Serviceability and TCE Improvements

Support Bundle Size Management: The ability to configure which files and data are collected and sent home. This reduces the payload, time and network usage for events that do not require a full dump of the logs.

Core Size Management: Reduces the amount of data sent during significant events to only what is required to triage the issue.

In summary, Data Domain OS 6.2 is a feature-packed release that brings a lot of value to our customers. We hope you will consider upgrading your Data Domain appliances to the latest Data Domain OS 6.2. To learn more about Dell EMC Data Domain, check out the first blog in our series, view the Dell EMC Data Domain home page, read Data Protection blogs, and follow @DellEMCProtect on Twitter for our latest announcements and content.

Further Resources

Video: Dell EMC Data Protection Appliance Launch Broadcast
Blog: Dell EMC Data Protection Appliance Announcement
Infographic: Dell EMC Data Protection Appliances

The third and final blog in this series will focus on enhancements to the Data Domain DD3300 appliance – a small, compact 2U appliance well suited to mid-sized organizations. Check back on March 6th for more information, or talk with your sales rep for the full story today!
This is blog #2 in our series, focusing on enhancements to Dell EMC Data Domain, a leader in Purpose-Built Backup Appliances (PBBA).

What a start to 2019! Dell EMC Data Domain delivers another feature-packed software release (Data Domain OS 6.2) with hardware component enablement. The key themes for this release are:

Dell EMC Data Domain DD3300 Enhancements: We acted on customer feedback on the DD3300 and are proud to announce faster networking, support for the VTL use case over Fibre Channel, and more capacity flexibility with a new 8TB DD3300 model that can grow in place up to 16TB and provides a clean upgrade path to 32TB on a single box.

Increased Differentiation: We made significant strides in improving performance – single-stream restore performance has improved up to 4X compared to the previous release (Data Domain OS 6.1.2), and so has overall restore performance. We have expanded our cloud ecosystem for both Dell EMC Cloud Tier and Dell EMC Data Domain Virtual Edition (DD VE) with support for Google Cloud Platform and Alibaba Cloud. In addition, DD VE also supports AWS GovCloud and Azure Government Cloud.

Simplified Experience: Better management through automation and end-to-end data protection visibility with Dell EMC Data Domain Management Center (DD MC).

Serviceability and TCE Enhancements: Get more from your Dell EMC Data Domain appliances.

Purpose-Built Backup Appliance (PBBA) Features

We have made major strides with respect to performance, improving overall restore and single-stream restore performance. We can now run Dell EMC Data Domain Management Center (DD MC) on KVM, and we also support LDAP for user access.

Performance

Restore and Recall Improvements: Overall restore performance has improved across both tiers, active and cloud. No user intervention is needed; the gains come automatically with Data Domain OS 6.2.
Upgrade your Data Domain appliances to Data Domain OS 6.2 and you’ll experience better performance.
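The support bundle size management capability described earlier comes down to collecting only the log categories an event actually needs. Here is a generic sketch of that idea; the categories, profiles and file names are hypothetical, not Data Domain’s actual implementation or CLI.

```python
# Generic sketch of selective support-bundle collection. The profiles and
# log categories below are hypothetical illustrations, not Data Domain's.
BUNDLE_PROFILES = {
    "full": {"system", "kernel", "replication", "filesystem", "network"},
    "network-event": {"system", "network"},  # minimal set to triage the event
}

def collect(files_by_category, profile):
    """Return only the files whose category the chosen profile requires."""
    wanted = BUNDLE_PROFILES[profile]
    return [f for cat, files in files_by_category.items()
            if cat in wanted
            for f in files]

logs = {
    "system": ["messages.log"],
    "kernel": ["kern.log"],
    "network": ["net.log"],
    "replication": ["repl.log"],
}
print(collect(logs, "network-event"))  # a far smaller payload than "full"
```

Sending only the two relevant files instead of the whole log dump is what reduces payload, time and network usage for routine events.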
Artificial intelligence will have a transformative impact on our world, and we’re only at the beginning. At Dell Technologies, we’re helping our customers simplify and drive data science and AI initiatives that deliver valuable insights, automation and intelligence to fuel innovation across their IT landscape – from edge locations to core data centers and public clouds.

With more than 48 percent of CIOs deploying AI this year, AI is increasingly becoming a strategic priority for organizations across industries, sizes and geographies. Yet deploying and managing AI workloads can be complex and time-intensive, requiring extensive hardware/software integration and testing.

To ease this transition and remove complexity, Dell has developed new solutions to help data scientists and developers get their AI applications and projects up and running without delay. Our latest solutions to help prepare IT environments for AI, all globally available today, include:

Dell Cloud-Native Machine Learning with One Convergence® DKube™

One Convergence DKube deep-learning-as-a-service software delivers cloud-like ease of use on-premises and reduces cost by abstracting the underlying accelerated compute and storage. This helps remove the operational headaches often encountered when deploying AI infrastructure on-premises, especially for key tasks such as model development, training and deployment. This desktop-to-data-center AI system – which can run on both workstations and servers – automatically recognizes available resources and can be scaled out by adding nodes to the cluster.

Dell EMC HPC Ready Architecture for AI and Data Analytics

The new Dell EMC HPC Ready Architecture for AI and Data Analytics delivers the power of accelerated AI computing to the edge with an easy-to-deploy, cloud-native stack.
This combination of the Bright Computing® Solution for Edge, the Dell EMC Data Science Portal, PowerEdge servers with NVIDIA GPUs and Isilon scale-out NAS storage enables faster access to data gathered at any location. It also reduces IT silos and consolidates operations for up to three times lower total cost of ownership (TCO) versus running AI, data analytics and HPC workloads on separate systems.

Dell EMC Ready Solutions for Data Analytics

Dell EMC Ready Solutions for Data Analytics introduced two new validated architectures today to help customers use AI and data analytics to quickly gain insights from their data through a choice of pre-tested software and hardware configurations based on different use cases.

Dell EMC Ready Solutions for Data Analytics – Spark on Kubernetes

Customers can now speed up large-scale batch and streaming data processing with Apache® Spark™. Building on the successes of Hadoop, Apache Spark is a unified analytics engine that performs data processing in memory instead of on disk, allowing some workloads to run up to 100 times faster.

Dell EMC Ready Solutions for Data Analytics – Splunk Enterprise

Customers can now harness the power of Splunk® Enterprise to get real-time insights and business value from machine data, one of the most underused organizational assets. Based on Dell EMC PowerEdge servers, the architecture provides high performance and low-latency I/O.

Dell Precision Data Science Workstation Guided Install Edition

Data preparation and model training are often the most time-consuming parts of a data scientist’s role.
Dell recognizes this and designed the Dell Precision Data Science Workstation portfolio to help data scientists focus on experimenting, exploring and uncovering insights, rather than maintaining AI systems and waiting for model training iterations to complete.

Customers can maximize productivity, speed up processes, yield better-quality insights and lower project costs by running models on a preconfigured, pre-validated Data Science Workstation. Data scientists no longer need to choose between enterprise-class AI platforms and AI platform flexibility.

Based on Dell’s Precision 7000 Series mobile workstations and the 5820 and 7920 tower workstations, the portfolio features NVIDIA® Quadro RTX™ 5000, RTX 6000 and RTX 8000 GPUs. The systems are certified NVIDIA NGC-Ready to help data scientists, developers and researchers quickly deploy AI frameworks with containers and get a head start with pre-trained models or model training scripts. In addition, the Data Science Workstation portfolio is optimized for NVIDIA Data Science Software powered by RAPIDS™, including GPU-optimized XGBoost, TensorFlow, PyTorch and other leading applications.

Dell Precision Data Science Workstation and Dell EMC Isilon scale-out NAS H400 solution for data scientists

AI initiatives generally start small, with a data science problem and a proof of concept. This new data science and modeling offering combines the power of the Dell Precision 7920 Tower Data Science Workstation and Dell EMC Isilon scale-out NAS. With this solution, data scientists can build their models on workstations while training those models on data residing on fast, high-performance, scale-out shared storage. They can then move to production seamlessly without switching systems or migrating data.
With this approach, data scientists no longer need to copy data sets locally for training, further increasing productivity and reducing model training time.

Dell EMC Isilon scale-out NAS for Deep Learning Workloads

Dell and NVIDIA have teamed up to show how to accelerate and scale deep learning training workloads using the combination of Dell EMC Isilon, Dell EMC PowerSwitch and NVIDIA DGX-2™ systems with NVIDIA V100 Tensor Core GPUs. Following this reference architecture, businesses can deploy faster, achieve greater model accuracy and accelerate business value with AI at scale. Dell EMC Isilon scale-out NAS is also available in an engineering-validated reference architecture with NVIDIA DGX-1™ systems, simplifying and accelerating the deployment of enterprise-grade AI initiatives.

Regardless of where, or how, your organization employs AI as part of its digital transformation journey, Dell Technologies can help create an AI-ready technology environment. Click here to find out more.
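Returning to the Spark-on-Kubernetes offering mentioned earlier, it may help to see the general shape of submitting a Spark job to a Kubernetes cluster. This is a configuration sketch based on Apache Spark’s standard `spark-submit` interface; the API server address, image tag and jar path are placeholders, and the validated architecture’s actual configuration may differ.

```shell
# Submit Spark's bundled SparkPi example to a Kubernetes cluster in
# cluster mode. k8s-apiserver.example.com, the image name/tag and the
# jar path are placeholders for your own environment.
spark-submit \
  --master k8s://https://k8s-apiserver.example.com:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=4 \
  --conf spark.kubernetes.container.image=example-registry/spark:3.0.0 \
  local:///opt/spark/examples/jars/spark-examples.jar
```

Kubernetes then launches the Spark driver and executors as pods, so the same cluster that runs other containerized workloads can also run Spark jobs.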