Dell TechCenter: BI Summit – 8th Year of Attendance! Lucky Number 8

As the week of BI Summit in Las Vegas approaches, I find myself doing a bit of analytics on the show itself.

  • When was the first time I went – in 2006
    • What were people talking about – Corporate Performance Management (CPM) and business intelligence (BI) together as innovations.
    • Where was I – I was working for Business Objects (NOT a division of SAP, yet) and we were driving a message that illustrated the value of enterprise performance management and enterprise business intelligence.
    • Fast forward to 2014 – last year’s Summit
      • What were we talking about – diversity of data, Hadoop and the creation of the data lake/reservoir, and social media data driving behavioral changes
      • I now work for Dell Software as a Subject Matter Expert (SME), and enjoy having a complete information management and analytics software portfolio to help our customers attack and solve big data issues as well as all data issues. Poised on the cusp of the data economy, Dell’s software portfolio delivers value-focused solutions that help customers put their data to greater effect.

Readying myself for this next trip, I find myself reflecting on and marveling at what I am looking forward to during the conference. I would like to share my musings for just a moment…

Big Data for Big Effect – For this year’s show, I want to see what the esteemed analysts are encountering with customers. We have been talking about big data for more than 5 years, and the sense I have from interactions with my customers is that big data has finally matured – from hype to reality. It has shaped up to be different from how we originally thought it would be. Everyone has big data issues – not necessarily all 4 V’s (Velocity, Volume, Variety, and Veracity) at the same time, but clearly many V’s in different areas of the organization. I am finding that my customers struggle with Variety and Veracity more than Velocity and Volume. With in-memory and new styles of data stores (graph, columnar), volume and velocity seem to be more manageable. The other two V’s tend to be more challenging and elusive to solve – but we do have a few different strategies and suggestions that seem to resonate with customers. I am curious to see what the analyst community is seeing and saying.

Visualization BI Sweetheart no more – As amazing as it may seem, it appears that the viz biz now has wallflower status. This year at the show, I hope to learn how customers are making the different BI and analytics technologies mesh together. Clearly, visualization vendors are co-existing with all the different BI and analytics platforms. In fact, analyst Rita Sallam has a session that talks about interactive visualizations for everyone (Session #A3 on Tuesday). I am curious to see how organizations are achieving enterprise BI or analytics insight while managing multiple platforms and a diverse ecosystem of tools. This will be an indicator of many things about the customer base: adoption, complexity, and willingness to change cultural practices.

Citizen analysts and chief data officers – New titles, new roles? I am hearing that clients are investing in new human assets – new titles and new jobs – but what are their responsibilities? Everyone seems desperate to have analysts, data scientists, mathematicians. Apparently – math is the new *** and data scientists now have a head honcho – the Chief Data Officer (CDO). So what are all these new roles emerging in the marketplace? I plan on attending all the sessions that I can to see if the market has started the process of defining and understanding the responsibilities of these personas.

After the show, I plan on coming back and sharing what I learned at this preeminent BI and analytics event. Stay tuned for insight and understanding.

If you are attending the show with us, please join us at any of the following locations/events:

Dell Software Booth #425

Customer Session - Dell Software: Actionable Analytics in Biotechnology - Keys to Success

SPS3: 10:45 AM - 11:30 AM, Monday, March 30

Hospitality Suite – Dell Software: Jedi Masters and Padawans – Join Dell Software IMG Analytic Masters

HS3: 6:00 PM - 8:00 PM, Tuesday, March 31

Hope to see you there!

Jo

Dell TechCenter: Calit2 is Using HPC to Unlock the Secrets of Microorganisms

A look at how HPC is being used to help researchers study the 100 trillion microorganisms in the human body. (read more)

Dell TechCenter: Internet of Things Unlocks the Power of Data in a Connected World

Lately I’ve been spending a lot of time thinking about the impact of the Internet of Things (IoT) on nearly every aspect of our daily work and personal lives. Today, our phones, homes and cars are getting smarter thanks to a slew of embedded sensors and geolocation capabilities. IoT also is making major inroads at work, as reinforced by a recent Dell-sponsored EMA survey that revealed 47 percent of organizations polled view IoT as essential or important to their business.

Snippet of infographic on the Internet of Things prepared by EMA for Dell

Some pundits say IoT is a revolution, but I see it more as an evolution: data already exists in many forms; the challenge is getting the right data at the right time to the right person. To take full advantage of IoT, organizations need to first clear some major stumbling blocks.

The infographic below illustrates some of the findings of our survey and identifies some of the many reasons for the IoT hold-up. The top five in particular are:

  1. Lack of an integrated team to make use of all the IoT data
  2. Privacy issues with data produced by end devices
  3. Quality and reliability of data from end devices
  4. Connectivity throughput to end devices
  5. Undefined business case for the use of device information

To illustrate the importance of lowering these impediments, let’s take a look at the connection—or rather, disconnection—between IoT and disaster preparedness. Clearly, data generated by IoT can play an important role in helping prepare for and respond to natural disasters.

During a recent IoT hackathon to help disaster victims, six university students convened at Boston-based high-tech company PTC and went back in time to New Orleans in 2005 when Hurricane Katrina hit. They then tied together an IoT-centric disaster relief plan comprising aerial maps, a cellphone app streaming real-time data, an unmanned aerial vehicle and remote sensor data—all of which was streamed into a centralized command and control center.

By integrating all the data from the connected devices, the students were able to create real-life scenarios, so rescuers could quickly reach the most severe trauma cases and bring them to the correct facilities. As the hackathon proved, IoT makes it possible to take advantage of connected devices to create a super system with a virtual command center. In doing so, you can create spontaneous collaboration, consensus and coordination—with a single mission and common purpose, resulting in a better outcome.

With IoT, virtualized, cloud-based command centers can be deployed much faster than their brick-and-mortar counterparts. People can check in and out via their mobile devices, and subject matter experts can share insights transparently and instantaneously. The effectiveness of all efforts can be monitored in real-time, making it possible to change course or escalate efforts immediately. Using solutions like Dell Statistica, information can be gathered and culled to drive deeper understanding of the meaning behind the data.

If IoT had played a bigger part in the “blizzard of 2015,” perhaps the National Weather Service (NWS) could have saved face and New York City could have saved $200 million in lost economic activity. Instead, the NWS was harshly criticized for forecasting that a potentially historic blizzard would dump up to 30 inches of snow on NYC without communicating the uncertainty of their prediction. Meanwhile, Big Apple officials were slammed for shutting down schools, roads and public transportation for less than a foot of snow.

According to The Washington Post, flawed forecast information and failed communications were the root of the problem. Not only did the NWS fail to adequately communicate the uncertainty in what was an extremely complicated forecast, they also fell short in providing any scenarios beyond the worst-case, which caused a wave of panic to ripple across the Northeast corridor.

It’s highly likely that IoT supplies the weather service with enough data to produce minimum, maximum and most-likely snowfall amounts to guide local forecasts. But the failure to aggregate, integrate and share this vital information led to a black mark on the NWS and possible loss of public trust in weather forecasts.

I bet that IoT could help the NWS regain public confidence in its overarching mission to protect life and property. I’m also sure that as IoT continues to evolve, it will become a prevalent part of everyday life, enabling us to collect, visualize and analyze data in new and exciting ways. In addition to providing first responders with virtual command centers that deliver actionable insight when every second counts, I envision a day when IoT will unlock the power of all kinds of data to change how we work and live.

Where does IoT fit into your grand scheme of things? Drop me a line at Joanna.schloss@software.dell.com to continue the conversation.

Infographic about the Internet of Things prepared by EMA for Dell

Dell TechCenter: In Backups As In Life, You Get What You Pay For!

Most companies rely on Microsoft Exchange Server, and if your company is one of them, there’s a solid chance that downtime will cause you major problems.

In a previous article about tips to protect Microsoft Exchange, we talked about offline backups, a somewhat primitive backup method that has substantial limitations but still might be the right choice in some situations.

Today, let’s look at ways to achieve a smarter Microsoft Exchange backup. Let’s look into the pros and cons of another option, the Native Windows Server backup.

The Pros
As the name suggests, Native Windows Server backup is included with Windows Server. It gives you minimal data protection capabilities, including a baseline ability to back up the full Exchange data store.

Here’s one way it’s superior to offline backups: it doesn’t require you to take the database offline before backing it up. You can continue to use Exchange during the backup and you won’t have the hassles of narrow, restrictive backup time windows.

Here’s one more item for the “plus” category: Native Windows Server backups can be used to back up both physical and virtual machines.
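For the scripting-inclined, here is a minimal sketch of what kicking off a native backup might look like, shelling out to the built-in wbadmin command-line tool from Python. The drive letters are hypothetical placeholders: E: stands in for the volume holding the Exchange data store and X: for the backup target.

    import subprocess

    # Minimal sketch: run a one-time VSS full backup of the volume holding
    # the Exchange data store. E: and X: are placeholders; substitute your
    # own data volume and backup target.
    result = subprocess.run(
        [
            "wbadmin", "start", "backup",
            "-backupTarget:X:",  # where the backup image is written (placeholder)
            "-include:E:",       # volume containing the Exchange data store (placeholder)
            "-vssFull",          # VSS full backup, so Exchange can truncate its logs
            "-quiet",            # run without prompting for confirmation
        ],
        capture_output=True,
        text=True,
    )
    print(result.stdout or result.stderr)

Because the backup is VSS-aware, Exchange stays online while it runs, which is exactly the advantage over the offline method discussed earlier.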

The Cons
But, as you might have guessed, there are some drawbacks to the Native backup method (since it’s included in the purchase price of Windows Server, it fits the old adage “you get what you pay for”): the technology involved puts a lot of strain on the server and can cause performance to plummet.

So even though technically you don’t need to do your backups at night when Exchange isn’t active, in reality, a lot of organizations have to.

And here’s another potential drawback: Native Windows Server backup doesn’t let you restore individual items like deleted email messages.

There are other drawbacks you probably want to think about:

  • Recovery Speed – If you need the ability to recover data quickly, the native backup option isn’t for you. Recoveries are time-consuming. This is NOT the situation you want when the office manager is fuming and breathing down your neck because “my email’s not working”! 
  • Data Risk – Like its more primitive cousin, the offline backup, the native backup option can leave your data exposed to risk. If the server overhead means you have to run backups at night, a full day’s data remains unprotected until that night’s backup completes.
  • Off-site Storage – With Native Windows Server backups, your data is not automatically replicated off-site. You can rotate your backup tapes as part of an off-site storage plan, but if you do this as your primary form of data protection, it can be a real pain to manage; it can lengthen data recovery time as tapes are retrieved from the offsite storage location.

There are plenty of drawbacks and limitations you’ll want to keep in mind as you’re considering the native backup method. You’ll want to choose carefully. All things considered, it may or may not be the right option for you.

Just like we recommended if you are taking a look at the offline backup option, when you’re thinking about a native backup solution, consider it from every angle. Decide carefully if it’s right for your situation.

For more tips and information, see the eBook, Six Ways To A Smarter Microsoft Exchange Backup, which evaluates six different approaches you can take to protect your Exchange data. It can help you determine the right approach for your company. Download your free copy now!

 

 

Ryan M. Garcia Social Media Law: YES PLeaSe: A Legal Guide To Periscope And Meerkat

For the first time in several years we have some significant new entries to the social media application world in the form of Meerkat and Periscope. Both of these applications allow users to quickly and easily provide Personal Live Streaming (PLS), meaning they can start shooting video and instantly share it on social media. No shooting video and uploading to YouTube/Instagram/Vine; this is an ongoing live stream complete with user interaction. In all likelihood this is a function that other platforms can provide as well, especially as our handheld technology continues to grow in processing power and our wireless bandwidth keeps expanding. But for now these are two significant players in an emerging space that come with some intriguing legal issues.

After experimenting with the two applications, including an hour-long live cast of my podcast (all about geek culture; if you’re interested you can check out the podcast on iTunes or our website), I put together this quick look at some of the high-level legal concerns for brands and organizations that are thinking about getting involved with PLS. Is it too much to say as brands develop a Go-To-Meerkat strategy? It is? Sorry.

Because I’m a lawyer there are, of course, three main risks to be concerned about. And, oh, how convenient, they spell out YES so we can make a great blog post title. Those three concerns are YouTube+/-, Engagement, and Saved Streams. Okay, I guess technically that would spell YESS but that sounds reptilian and I’m trying to avoid that easy lawyer joke. So YES it is.

Because professional courtesy.

Also please know this is a highly dynamic area.  Meerkat was first to market but Twitter had already acquired Periscope and was preparing its own launch while Meerkat was getting tons of press at SXSW.  So Twitter cut off some important access to Meerkat (both apps use Twitter for crucial functions).  This kind of activity may continue for others that try to create a similar service on the backbone of an existing one, and we’re sure to see completely independent services start up that tout their protection from such antics.  But in a new field with this much attention we are bound to see significant moves in functionality and usage over the next several months, so stay tuned for additional posts on the subjects.

YouTube+/-

Personal Live Streaming is a video stream and so it carries most of the same legal concerns as any video content an organization would post on YouTube.  But the live component of PLS makes for some interesting additions and subtractions to your standard legal analysis of video content.

On the plus side, or additional analysis you should do, you will need to consider the environment in which the stream will be recorded.  Since these streams go out live you will not be able to review them for their content prior to publication.  That video your marketing team did with that catchy, unlicensed Top 40 hit?  Yeah, you can review that before it goes on YouTube so that Marvin Gaye’s estate doesn’t sue you for $7 million but you can’t review it before it streams.  So the environment and context of the video stream should be considered for any legal threats with the team putting the stream together–you won’t get a chance to fix it later.  Consider copyrights, trademarks, privacy concerns, licensing issues, and please at least briefly discuss defamation law with your on-screen talent/broadcaster.

On the minus side, or some mitigating factors that YouTube doesn’t traditionally have, these streams are not intended to be permanent.  Risky activity could be mitigated by the fact that the videos are generally only visible while they are being created (except see our third part, Saved Streams, below).  If someone on camera says “Top Hollywood Celebrity explicitly endorses Company Product!” during a live stream, hopefully the live and non-recorded nature of the film could mitigate any potential rights of publicity claims (or at least damages).  By the way, don’t invite that streambomber to your next livestreams.

Unless it’s a dolphin. Dolphins can photobomb or streambomb all they want. It’s the law.

Engagement

Both apps provide similar ways to engage with stream watchers. Stream watchers can like a stream or send a comment to the broadcaster and those watching. Neither app has moderation abilities at this point–so if someone starts flooding your video broadcast with explicit text or spam there is nothing you can do.

One crucial way the apps differ on engagement is comments. Meerkat comments are sent via Twitter–they are sent as Twitter replies to the original tweet announcing the Meerkat broadcast. This can be both good and bad in terms of monitoring and recording the comments and in who can see the posted comments. Periscope comments are limited to the video stream itself, also with its own benefits and drawbacks. One consideration organizations should make when using PLS is whether they will have an individual conduct the streams or a small team. The single user and video shooter can be very effective and personal, but it can also be difficult to engage an audience while delivering personal content (a speech, a demonstration, etc.). Having one person operate the camera (well, phone/tablet camera) while another is being filmed will help to monitor video issues and comments, or you may even want to split the duties further, with one person operating the camera and another watching the comments. There’s no right answer, it’s just something to think through.

Unless you have one of these. Because now you have extra fingers to use and you are awesome.

Saved Streams

PLS is mostly about live video, but both apps have some replay abilities that may bring legal risks or influence which application your organization uses for its own experiments. Meerkat streams are public, and the company had to issue a quick fix recently to prevent anyone from hijacking another user’s stream. That security issue aside, Meerkat faces another legal risk in terms of recorded sessions. Meerkat gives broadcasters the option to save the video to their phone/tablet at the end of the session, but there is already a service that allows any user participating in a Meerkat stream to send out a single hashtag that will record the stream and then post it to YouTube.

The idea that some third party can record and post your stream even if you yourself do not feels quite risky depending on the content that is being sent out.  In many ways this is no different than a user sending a photo on Snapchat that will be deleted but the recipient uses their phone’s operating system to take a screen capture of the image.  But if your organization doesn’t use Snapchat to send out photos then that may not be an analysis you’ve done.  So it’s something to consider.

Pictured: extensive legal analysis.

Periscope, on the other hand, does not currently have a way for third parties to easily record your stream and post it (although there could certainly be a way to record video sent to watchers’ phones/tablets/computers). The app will, however, allow you to upload the video to Periscope’s servers and allow other users to watch or re-watch the stream for a period after it was filmed. That at least gives the broadcaster some control over how long the video will live, but it is also something that should be considered.

 

It’s exciting to see new functions and communities spring up in the social universe. We haven’t had a significant new step like this since Pinterest many years ago. Whether this remains a thriving independent community or becomes more of a feature that everyone will enable (like checking in from a few years ago) remains to be seen.


Kevin Houston: Dell VRTX Now Scales Up to 50TB of Shared Storage

A few weeks ago Dell added a 2TB 2.5″ 7.2K RPM Near-Line SAS drive to the list of supported drives. This new addition not only increases the shared storage capacity of the Dell VRTX chassis from 48TB to 50TB but also offers better performance.

Previously the VRTX had a maximum shared storage capacity of 48TB, which was achieved by using 12 x 4TB 3.5″ 7.2K RPM Near-Line SAS drives. This required the use of the 12-bay 3.5″ drive model of the PowerEdge VRTX. Use of 3.5″ drives provides a lot of capacity, ideal for archiving or file storage, but it is not the best choice for performance. Below is the list of current 3.5″ drive options.

3.5” Drive Options (as of March 25, 2015):

  • 300GB 15K RPM SAS 6Gbps 3.5in Hot-plug
  • 600GB 15K RPM SAS 6Gbps 3.5in Hot-plug
  • 1TB 7.2K RPM Near-Line SAS 6Gbps 3.5in Hot-plug
  • 1.8TB 10K RPM SAS 6Gbps 3.5in Hot-plug
  • 2TB 7.2K RPM Near-Line SAS 6Gbps 3.5in Hot-plug
  • 3TB 7.2K RPM Near-Line SAS 6Gbps 3.5in Hot-plug
  • 4TB 7.2K RPM Near-Line SAS 6Gbps 3.5in Hot-plug

The Dell PowerEdge VRTX also comes in a 25-bay 2.5″ drive model. As you’ll see in the chart below, the 2.5″ drive model provides many more options, including the ability to hold the largest drive capacity. The newly announced 2.5″ 2TB drive used across the 25 drive bays will give you 50TB of raw shared storage (a quick check of that math follows the drive lists below). In addition, the 2.5″ chassis model provides many more spindles compared to the 3.5″ drive model, so you can expect much better application performance. Also, the 2.5″ drive model is where you’ll find SSD options for those performance-hungry applications.

2.5” Drive Options (as of March 25, 2015):

  • 146GB 15K RPM SAS 6Gbps 2.5in Hot-plug
  • 300GB 10K RPM SAS 6Gbps 2.5in Hot-plug
  • 300GB 15K RPM SAS 6Gbps 2.5in Hot-plug
  • 500GB 7.2K RPM Near-Line SAS 6Gbps 2.5in Hot-plug
  • 600GB 10K RPM SAS 6Gbps 2.5in Hot-plug
  • 600GB 15K RPM SAS 6Gbps 2.5in Hot-plug
  • 900GB 10K RPM SAS 6Gbps 2.5in Hot-plug
  • 1TB 7.2K RPM Near-Line SAS 6Gbps 2.5in Hot-plug
  • 1.2TB 10K RPM SAS 6Gbps 2.5in Hot-plug
  • 1.8TB 10K RPM SAS 6Gbps 2.5in Hot-plug
  • 2TB 7.2K RPM Near-Line SAS 12Gbps 2.5in Hot-plug

2.5” SSD Drive Options (as of March 25, 2015):

  • 200GB SSD SAS Mix Use 12Gbps 2.5in Hot-Plug
  • 200GB SSD SAS Write Intensive 12Gbps 2.5in Hot-Plug
  • 400GB SSD SAS Mix Use 12Gbps 2.5in Hot-Plug
  • 400GB SSD SAS Write Intensive 12Gbps 2.5in Hot-Plug
  • 800GB SSD SAS Mix Use 12Gbps 2.5in Hot-Plug
  • 800GB SSD SAS Write Intensive 12Gbps 2.5in Hot-Plug
  • 1.6TB SSD SAS Mix Use 12Gbps 2.5in Hot-Plug
  • 1.6TB SSD SAS Write Intensive 12Gbps 2.5in Hot-Plug
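
As a quick sanity check on the capacity figures quoted above, here is a back-of-envelope sketch in Python using only the bay counts and drive sizes already mentioned. Note these are raw figures; usable capacity will be lower once RAID overhead and hot spares are factored in.

    # Raw shared-storage capacity = number of shared drive bays x drive size.
    configs = {
        "12-bay 3.5in chassis with 4TB NL-SAS drives": 12 * 4,  # previous maximum: 48TB
        "25-bay 2.5in chassis with 2TB NL-SAS drives": 25 * 2,  # new maximum: 50TB
    }
    for config, capacity_tb in configs.items():
        print(f"{config}: {capacity_tb}TB raw")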

A couple of observations about the VRTX shared storage:

  • You can have a mix of drive types (SAS, NL-SAS, SSD) in the shared bays within VRTX but you can’t mix and match when creating virtual disks.
  • If you view the VRTX storage as Direct Attached Storage (DAS) you’ll recognize that you won’t find advanced storage features like snapshots, replication, compression, deduplication, etc.

 

 

Kevin Houston - Founder, BladesMadeSimple.com

Kevin Houston is the founder and Editor-in-Chief of BladesMadeSimple.com. He has over 18 years of experience in the x86 server marketplace. Since 1997 Kevin has worked at several resellers in the Atlanta area, and has a vast array of competitive x86 server knowledge and certifications as well as an in-depth understanding of VMware and Citrix virtualization. Kevin works for Dell as a Server Sales Engineer covering the Global Enterprise market.

 

Disclaimer: The views presented in this blog are personal views and may or may not reflect any of the contributors’ employer’s positions. Furthermore, the content is not reviewed, approved or published by any employer.

 

 

 

 

Dell TechCenter: Understanding Hosted Private Cloud

Hosted private cloud is a growing part of cloud dynamics and an important trend in cloud computing. In 2014, cloud entered the formal IT portfolio, and technology managers stopped treating cloud as competition. In 2015, cloud technologies will mature into the driving force powering the most successful companies. Cloud enables unparalleled levels of sustained innovation. Companies that harness its power will win, serve and retain customers better than their competitors - in less time and for less money - if they take advantage of all the cloud has to offer. But where should you start to become better informed?

At the Dell Cloud Insight Series, Lauren Nelson of Forrester, Gerald Seaman of Intel and Ozan Talu of Dell sit around a table discussing hosted private cloud

The first event in the five-part Dell Cloud Insight Series featured the thoughts and expertise of Lauren Nelson of Forrester, Gerald Seaman of Intel and Ozan Talu of Dell. The panel addressed the business and IT drivers as well as the business and IT benefits, such as:

List of the business and IT drivers as well as the business and IT benefits of hosted private cloud

For more from Lauren Nelson and our panel, click on the livestream webcast below or visit https://fittotweet.typeform.com/to/qjsmvZ

Screen capture from the Dell Cloud Insight Series featuring the thoughts and expertise of Lauren Nelson of Forrester, Gerald Seaman of Intel and Ozan Talu of Dell

Stay tuned for future Dell Cloud Insight webcasts, including strategy discussions from additional analysts and Dell partners. For more information on Dell Cloud Services, please visit www.dell.com/mycloud

Dell TechCenter: Dell simplifies storage management for enterprise Internet of Things (IoT)

Over the past few years, the Internet of Things (IoT) has gradually evolved from being a concept into a reality. Approximately 12.1 billion Internet-connected devices were in use in April 2014. By 2020 the number of devices is expected to surpass 50 billion. That’s new endpoints connecting to the internet at a rate of roughly 250 devices every second.
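
As a rough back-of-envelope check on those figures (a sketch only; the device counts come from the paragraph above and the dates are approximate), the linear average works out to roughly 200 new connections per second, so the quoted 250-per-second figure is the right order of magnitude, and plausibly reflects the faster rate late in the period as growth accelerates:

    # Back-of-envelope: average rate of new internet-connected devices,
    # from ~12.1 billion in April 2014 to ~50 billion by 2020.
    start_devices = 12.1e9
    end_devices = 50e9
    years = 5.75                          # April 2014 to early 2020 (approximate)
    seconds = years * 365.25 * 24 * 3600

    avg_rate = (end_devices - start_devices) / seconds
    print(f"~{avg_rate:.0f} new devices per second on average")  # ~209 per second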

Business Insider separates the IoT market into three main sectors: enterprise, home, and government, and estimates that the enterprise market will account for around 40% of the total devices connected across all sectors.  The enterprise IoT alone is estimated to be larger than the smartphone and tablet markets combined.  Across the world, businesses are connecting their “things” every day to create powerful new business value. These “things” are sophisticated devices that range from tiny sensors to jet engines with massive streams of telemetry. 

Getting past the hype of IoT requires businesses to identify realistic use cases that tie organization-wide analytics to strategic plans and business models; these typically start as a pilot and gain adoption based on their outcomes and successes. However, as the number of endpoints grows and data streams pack more information, IoT can quickly overwhelm traditional infrastructure and, in particular, the storage capabilities required for the implementation and scaling of IoT initiatives.

A core IoT system infrastructure requires data storage capabilities that handle not only enormous amounts of low-value data, but also provide a high-performance storage tier for data that is critical to business performance in real time. IoT solutions require high-performance access to new machine data as well as an array of business analytics tools, such as Dell Statistica, to provide true business value and meet enterprise expectations. IoT also demands a scalable and interoperable infrastructure to overcome obstacles in large-scale implementation. It should specifically address the heterogeneity of IoT devices and their data, and enable seamless addition of new devices across applications.

Moreover, as raw data is mined and analyzed into meaningful insights, more data gets generated at the infrastructure level. Businesses need to make sure security is implemented at this level to prevent the new data from being breached. Data security is a primary concern in any IoT environment with multiple connected devices at an enterprise level. Business-critical or confidential information must be protected from unauthorized access and properly disposed of when required. A typical enterprise will generate much more data, and therefore have more robust security requirements, than a typical consumer. There have been some concerns that early generations of IoT devices are vulnerable to attack.

Dell IoT Solutions

Our goal is to enable IoT to help our customers run their businesses better. We have storage solutions for IoT that can improve efficiency, simplify storage administration, reduce cost, enable secure information sharing, and increase data security. Depending on the IoT dataset scale, our architecture-agnostic solutions ensure the customer’s IoT solution meets their unique needs without imposing our view, provide robust analytics for insight-driven action, and allow them to scale from pilot to production quickly and cost-effectively. Dell SC Series SANs, with technology such as self-encrypting drives, can help prevent unauthorized access to administrative interfaces and protect data at rest, while leveraging standard enterprise-class features such as tiering, replication and snapshots. In a core IoT storage infrastructure, Dell Storage All-Flash and Flash-Optimized array offerings can deliver high performance to analytics tools in the high-value data tier, while dense arrays with nearline drives can be used for lower-value data that is aging toward its retention life. Additionally, moving “hot” files to our DAS-based offerings can further create a sophisticated storage tiering structure for your IoT environment. We just implemented a storage solution to help Wheeling Hospital manage its complex data infrastructure environment, which is a good representation of a typical IoT storage implementation. To stay updated on more use cases for IoT, subscribe to this blog and follow us @Dell_Storage on Twitter.

Dell TechCenter: Securing enterprise mobility: Where are your weak links?

As you launch a new mobility program or expand access to enterprise resources, security must be a top priority. You need ways to enable anytime, anywhere productivity without jeopardizing the security of enterprise applications and data. And as you’ve...(read more)

Dell TechCenter: Increase Productivity with Mobile Access to More Resources, More Securely

Easy access to business data and resources is critical for employee productivity. And enabling that productivity securely, for today’s mobile workforce, is a big opportunity for resellers to grow their mobile security business and delight SMB customers. To help our Dell Partners achieve this, we offer Dell SonicWALL Secure Remote Access (SRA) series appliances that provide mobile and remote workers using smart phones, tablets or laptops — whether managed or unmanaged — with policy-enforced SSL VPN access to mission-critical applications, data and resources without compromising security.

With the newest release, Dell SonicWALL Secure Remote Access (SRA) 8.0, mobile workers can get access to even more resources, more securely and Dell Partners can provide greater value to their SRA customers and grow their mobile security business.

Enable access to a broad suite of applications and resources

The applications and resources your customers depend on to successfully run their operations are likely unique to their business, and may include web apps, file shares, hosted virtual apps, client-server apps and more. To meet their mobile access needs, these customers need an access gateway that can enable secure access to the broad suite of resources and applications necessary for workers to be most productive.

Provide secure access from smartphones, tablets, laptops and PCs

Also, the range of devices workers want to use to access company resources and applications is growing exponentially. To meet the secure access needs of today’s workforce, access gateways need to support not only access from PCs and laptops, but also from popular smartphone and tablet devices and operating systems.

Access to more resources, more securely with HTML5 browser support

To help meet these needs, the Dell SRA release 8.0 adds support for access to more resources using standard HTML5 browsers, available for most smartphones, tablets and laptops. In addition to existing SRA support for HTML5 browser access to RDP and VNC applications, the new release adds HTML5 browser access to File Shares, FTP, Telnet and SSH services. Users with HTML5 browsers can now securely access these resources without requiring the Java or ActiveX browser plugins that legacy web browsers require, reducing threat risk and complexity. Also, users with devices that traditionally don’t support Java or ActiveX web browser plugins, such as iOS devices, can now use standard HTML5 browsers to gain access to allowed resources. In addition, to simplify the user experience, the new SRA 8.0 release now supports single sign-on for access to RDP and VNC servers and file shares.

The Denver Broncos enable mobile worker productivity without compromising security

SRA customers, including the Denver Broncos, tell the story best. Running a leading NFL team requires a lot of behind-the-scenes work, and the Broncos use innovative IT to help the organization stay productive and keep fans happy. Coaches, cheerleaders, marketers and other staff — totaling 300 people — use the Broncos’ private and virtual private networks (VPN) to share classified data including videos, playbooks and scouting information. The entire organization uses Dell SonicWALL Secure Remote Access (SRA) appliances to deliver safe and easy access to email, files, applications, and more. Watch this short video to hear how the Broncos increase productivity, securely, with SRA.

Learn more about Dell Secure Mobile Access solutions

The new SRA 8.0 release is now available to SRA customers with current support contracts and can be downloaded here.

For more information regarding Dell SonicWALL secure mobile access solutions, click here to visit our website. Additional information for Dell Partners can be found on the Dell PartnerDirect portal here. And if you’re not currently a Dell Partner but are interested in learning more about joining the Dell Partner Program, please refer to the Dell PartnerDirect portal here.

Dell TechCenter: Maintaining a Personal Touch is a Must-Have for ITO Service Providers

There’s an interesting paradox happening in the outsourcing industry: as more businesses are looking at outsourcing as a viable alternative to many IT infrastructure challenges, more processes are being successfully automated — taking time and error out of many outsourced processes. In fact, the rise of automation in many business processes has come to be seen as a huge benefit to both IT outsourcing customers and providers.

However, the strength of a successful outsourcing relationship is largely due to the effectiveness and strength of the communication between customers and providers. And that’s a critical role in the relationship that can’t be replaced by an automated service.

That’s the reasoning behind a recent whitepaper from Dell Infrastructure Managed Services. This insightful paper reiterates the importance of communication and personal interaction in an offshore/outsourced model — as well as how that same communication is a clear indication of a service provider’s ability to provide high-quality services and a customer experience that meets and exceeds expectations. Deepak Satya, Director—Solutions, Dell Services, relates how, “Seeing the customer in person makes employees feel more appreciated and satisfied with their jobs, making them willing to go the extra mile. And when customers appreciate the job done, employees are more likely to go off script, creating a more personalized customer service experience and adding value to the interaction.”

Putting the person in personal

The deeper the personal relationship the better for all involved. In other words, both IT service providers and customers benefit from strong communication and face-to-face interaction. “From a customer perspective, interactions that are more personal improve their opinion about the employee’s contribution, quality of service and the company as a whole,” Deepak writes. Of course, that’s not always possible, as many growing businesses have travel restrictions and off-shored services aren’t always in compatible time zones.

Despite the obvious challenge of distances, Deepak reasons that service providers can easily implement five steps that strengthen personal interactions, including:

  1. Increasing customer touch points.
  2. Ensuring every member of the team goes through customer-specific onboarding programs.
  3. Making an effort to introduce customers to team members, using online video conferencing solutions.
  4. Using online collaboration tools, shared applications and team workspaces.
  5. Adopting social media channels, including blogs and wikis, to help customers and team members collaborate.

“At Dell, we believe that service providers need to look at customer service as a fundamental part of the overall service delivery — utilized as both a key tool to grow the value of the customer base, and an important strategic differentiator,” Deepak explains. To be successful, customer service interactions should be seen as opportunities to deepen understanding of unique customer needs, and should be nurtured to provide a strong support structure for both customers and providers.

 

Cover of whitepaper titled: Bridging the Gap Between Contact and Communicate

About the whitepaper

While the demand for offshore services is still largely driven by costs, the ongoing rise in intellectual capital among offshore services firms enables an increasingly quality-driven value proposition. Organizational effectiveness is key to improving service quality. But in an offshore/outsourced model, a lack of communication and personal contact when delivering IT services may be perceived as an absence of organizational effectiveness. This paper discusses the importance of communication and personal interaction in an offshore/outsourced model and its reflection on a service provider’s ability to provide rich, high-quality services and a comprehensive customer experience.

Download the whitepaper

 

About the author

As Director—Solutions, Dell Services, Deepak Satya heads the solution center for APJ, EMEA and the Healthcare vertical. Having conceptualized and created the solution center hub in India, Deepak has assumed responsibility for keeping Dell Services and its customers ahead of the curve by developing new services, identifying transformational opportunities and creating solutions. Prior to joining Dell, Deepak defined and aligned the services vision with business goals for Wipro Infotech, Wipro Technologies and Cognizant. He has extensive expertise in creating IT infrastructure services strategy, architecture, methodologies, standards and governance, and is skilled in establishing and managing high-performance global teams using blended onshore/offshore delivery models.

Connect with Deepak on LinkedIn

 

Dell TechCenter: Enterprise Reporter Reporting: Scheduling Reports - How can I send a report to my Boss?

In the first article of this reporting series, Valerie gave us an overview of the Enterprise Reporter Report Manager and explained My Reports, Published Reports, and the Report Library. Today, let's take a closer look at what to do when you have spent...(read more)

Dell TechCenter: Dell Precision Appliance for Wyse: workstation virtualization for the rest of us

Last week, Dell announced the industry’s first-ever ISV-certified virtual workstation appliance, the Dell Precision Appliance for Wyse. The announcement pairs the industry-leading Dell Precision and Wyse brands and reinforces Dell’s leadership in both the workstation and desktop virtualization categories. With this new appliance-based solution, we are now raising the bar in terms of simplification and time to value for a whole new set of customers, many of whom may not have the resources or skills to adopt workstation virtualization.

So why is virtualization technology important? Simply put, the days of every user having a full stand-alone computer operating in isolation are coming to an end. Even smaller customers need to collaborate within a team, share information, work on the road or from home, and protect ever-increasing volumes of mission-critical intellectual property. Virtualization gets this data away from the edge of the network, where it is vulnerable and difficult to manage, and puts it into a secure, central location.

The benefits are numerous. Here at Dell we talk to customers every day and know the value of virtualization in real world situations. For example:

  • An architecture firm on the East Coast is dealing with design files that are close to a terabyte in size. Copying information from file servers to local workstations causes a significant delay in their workflow. By moving the data and workstations back into a data center with centrally managed storage and a fast network backbone they intend to cut hours of wasted time from their development process.
  • Another customer with racks of 1:1 remote workstations has run out of datacenter space. By adopting a virtualization solution, they will dramatically reduce the footprint of their equipment and delay the need to expand the data center for several years.
  • Can’t finish a project due to snow days? A customer in the Northeast has been locked out of work for many days this year due to snow. Having the ability to access a virtual workstation from home would buy them back valuable productive time as well as enable other remote working scenarios.
  • For most customers, the need to secure the data is universal. By implementing a virtualized workstation solution, customers can centrally manage the security of their IP and greatly reduce the likelihood of theft or leakage on endpoints that are hard to track and manage. 

With the release of the new Dell Precision Appliance for Wyse, we will take what has been a complex and difficult integration task and make it as simple as buying a new workstation. The appliance can be deployed in just five minutes once powered on. It is based on VMware hypervisor technology and supports the latest graphics and remote workstation technology from NVIDIA and Teradici respectively to deliver “no compromise” performance and reliability to the most power-hungry users. In addition to excellent technical performance, we are also working closely with our ISV application partners to certify their applications on the appliance so that customers can run their key workloads with confidence. 

At Dell, we are continuously innovating to provide new solutions to meet our customers’ evolving needs. We are excited to make workstation virtualization easy to deploy and run for a new set of customers who are looking to be more mobile and collaborative in graphics-intensive work environments.

Dell TechCenter: Flexibility in #SharePoint Migration – #Dell Migration Suite for SharePoint, Now Available in Microsoft #Azure Marketplace

When it’s time to migrate your SharePoint content, it can be a daunting task—one that gives even the most seasoned IT professionals pause. Third-party tools, such as Migration Suite for SharePoint, can offer significant flexibility with your...(read more)

Dell TechCenter: Why Dell SonicWALL CSSA Certification Training at EMEA Peak Performance 2015?

We have this exciting offer for partners who register for the upcoming EMEA Dell Security Peak Performance 2015. Here are the benefits for you and your organization in taking advantage of this valuable offer. Each class is taught live, by a certified...(read more)

Dell TechCenter: Think you’re a savvy virtualization admin? Read the e-book and find out.

Can you really learn everything you need to know about virtualization from one e-book? You can if it’s “The Definitive Guide to Virtualization Management Software,” prepared for Dell by David M. Davis, winner of the VMware vExpert...(read more)

Dell TechCenter: Dell’s Market Share Momentum Continued in Q4

Illustrating the strength of our end-to-end solutions portfolio, Dell’s market share momentum continued in the fourth calendar quarter of 2014 across multiple lines of business. According to data recently released by the industry analyst firm International Data Corporation (IDC), Dell grew faster than the industry in several key areas.

Here is an overview of our market share performance by product line:

Client Solutions

The fourth quarter of 2014 marked the eighth consecutive quarter that Dell’s PC business grew faster than the industry.  In a consolidating market, Dell grew 8.5 percent year-over-year, gaining +1.5 points of market share, with gains in every region, in both desktops and notebooks, and in both consumer and commercial.  We believe our award-winning products, highlighted by our recent success at CES, will enable us to continue this momentum in 2015.

Cloud Client Computing

As the industry leader in enterprise client devices, Dell regained the #1 worldwide ranking in Q4, growing 12.0 percent (vs. an overall market decline of 12.5 percent) and expanding our market share by +5.9 points to 27.2 percent.

Mainstream Servers

Dell’s success in the mainstream x86 server market accelerated in Q4, as we were the only major vendor to take both year-over-year and sequential unit share. Dell grew units by 8.6 percent in a market that grew at only 7.0 percent, allowing us to gain +0.3 points of share year-over-year and maintain our #2 worldwide spot with 24.7 percent market share. With our most advanced server line to date, including the 13th generation of PowerEdge servers, which continue to be recognized for their performance and systems management capabilities, we believe we are well positioned in this accelerating market in 2015.

Storage

During Q4, Dell was the only vendor to grow faster than the industry on a year-over-year basis.  Dell grew 6.4 percent, taking +0.2 points of share, the first time we have outgrown the industry in 12 quarters.  In addition, during the full year 2014, we held the number one position in total storage capacity shipped including both internal and external storage.  We continue to make investments in our storage business, including the recently announced Dell Storage all-flash array, which has the lowest entry price for an all-flash, mid-range storage solution by any major vendor.

Networking

Finally, Dell Networking momentum continued in Q4 with faster-than-market sequential growth and 2.8 percent year-over-year growth. With a market-leading approach around open standards, Dell Networking provides a robust infrastructure to scale our customers’ growth.

In 2014, Dell significantly expanded its portfolio of notebooks, desktops, thin clients, servers, storage and networking products, backed by industry-leading support, to deliver our customers the world’s most secure, reliable and valued technology solutions.  These products and services, paired with Dell’s focus on a best-in-class customer experience, translated into a wide range of share gains during the fourth quarter, and set the foundation for continued momentum in the coming year. 

Dell TechCenter: Help employees achieve a better work-home balance

When does the work part of your life end and the personal begin? If the results of the 2014 Global Evolving Workforce Study are any indication, your work life and home life have basically merged — into life. The mixing of work and home is not...(read more)

Dell TechCenter: What’s New for Desktop Authority – February 2015

Updated monthly, this publication provides you with new and recently revised information and is organized in the following categories: Documentation, Notifications, Patches, Product Life Cycle, Release, Knowledge Base Articles. Subscribe Knowledgebase...(read more)

Dell TechCenter: A Single Approach to Anypoint Systems Management — Unifying Your View of the Environment

You really can achieve a single view of your entire connected environment.

Throughout this series of posts I’ve emphasized that full control with a single, overall view is the promised land for systems management in the age of mobility, BYOD and the Internet of Things (IoT). It’s the only way you can keep the corporate data on your users’ personally owned smartphones, tablets and PCs as secure as the data on your corporate-owned computers and servers, while also managing a host of new, network-connected, non-computer devices.

The combination of the KACE K1000 Systems Management Appliance and Dell Enterprise Mobility Management (EMM) can help you reach the promised land of anypoint systems management without the patchwork of point solutions you’ve likely accumulated. As shown in the diagram, the Dell KACE K1000, through its integration with Dell EMM, provides you with a single, integrated view of your entire environment – corporate-owned devices as well as secure workspaces:

  1. Systems management of traditional devices like desktop PCs, laptops, Macs and servers, as well as network-connected non-computer devices with the K1000
  2. Systems management of all mobile devices, corporate-owned as well as BYOD, by EMM with integration of all asset data into the K1000
  3. Systems management of the secure workspaces within BYO PCs

Did we leave any devices out? We don’t think so. The Dell KACE-Dell EMM solution gives IT administrators the control and insight they need, while giving users the privacy they want and the freedom to be productive using their own devices.

Next steps

Once you’ve read Part 2 of our e-book, “A Single Approach to Anypoint Systems Management,” you’ll want to see a live web demo of the Dell KACE K1000 Management Appliance and Enterprise Mobility Management (EMM) solution. You can also take your own tour and interact with a live KACE K Series Appliance in our Demo Sandbox.

Dell TechCenter: The Elephant in the Room

Exploring the benefits offered by a Lustre/Hadoop hybrid cluster. (read more)

Dell TechCenter: Office Productivity on an Android tablet? It’s Coming.

Office productivity on an Android Tablet

Today’s knowledge workers are being asked to do more and more to help maintain a competitive edge, which has led to the blurring of lines between a work environment and personal life. Many people consider their workspaces to include airports, hotel rooms, customer sites or the home office. When 9-5 is no longer 9-5 and work is an activity instead of a set location, people want to be productive regardless of which device they are using and where they are – the key is to get the job done.


Staying productive on the go

Technology is increasingly enabling us to work how and when we want, and more importantly wherever we are. Recent studies, such as the Evolving Workforce Study by Dell and Intel, have shown that we are using multiple devices to complete our work in the most effective way possible. It’s no surprise that IT decision makers around the world are seeing productivity gains of 20% or more with tablet adoption. Most professionals choose the device most suited or most convenient to complete the task at hand rather than be constrained by location, device or technology. If someone is at their child’s baseball game and they get an urgent request to run some numbers, they want to be able to access, read and edit that document on the device they have on hand – and still catch the game.

That spreadsheet could just as easily be a presentation or contract, and those documents are likely based on Microsoft Office products like Word, PowerPoint or Excel. We are really excited that Microsoft is bringing its Office productivity suite to Android, as it brings increased business capabilities to the platform, and provides customers with true accessibility to the documents that they use most often across any of their devices. Office for Android will be available on Dell Venue tablets this summer.

Dell premium Android tablets, like the Venue 8 7000, will also soon be enabled for Android for Work. This platform makes it possible for business and personal information to securely coexist on one device, creating a standard, secure way to use one device for both work and play. When combined with Microsoft Office for Android, it provides true productivity for Android-based tablets.

Regardless of device or operating system, security is paramount to ensure that corporate information stored in documents and apps is not compromised while users are working beyond the corporate firewall. Dell has invested considerable time in developing endpoint security and enterprise mobility management solutions such as Dell Mobile Workspace and Dell Mobility Management to ensure data integrity at all times while providing users with the ability to move their applications seamlessly from one device to another.

Our customers have repeatedly told us that they do not want to go to multiple vendors to pull together a solution that helps secure their data as it moves between people and devices, and manage the numerous devices that now reside on their network. Dell has the most holistic solution for organizations to enable their increasingly mobile workforce, because we offer a consistent experience across devices and operating systems.

Dell TechCenter: What's new for vWorkspace – February 2015

Updated monthly, this publication provides you with new and recently revised information and is organized in the following categories: Documentation, Notifications, Patches, Product Life Cycle, Release, Knowledge Base Articles. Documentation ...(read more)

Dell TechCenter: Adrian Grenier’s Latest Role? Dell’s Social Good Advocate

You may have seen the recent announcement of our first-ever Social Good Advocate, Adrian Grenier, and wondered just what it means to take on that role. The actor told a group of us gathered to hear more last Friday that, to him, it presents a unique opportunity to effect positive change.

“I think it's exciting to be the first ambassador of this sort for Dell. And it's a true commitment by them to doing good,” he said. “I think one of the reasons I was asked to come on board was because I have an ability to bring people together.”

Adrian Grenier in the Dell Lounge at SXSW

With the announcement of the new role so recent, he says that’s exactly the first thing to do – gather around a table with our Dell team and talk about what possibilities exist for us to work together.

"We’re just getting to know each other, but one thing that I've gathered is that Dell has real tangible goals which is really exciting. When you have real goals that you can actually measure and know whether or not you’re being effective, that’s really useful,” Grenier said.

They are goals that we first started articulating in 1998, and as knowledge has increased since that time, we’ve evolved them into what is now called our 2020 Legacy of Good Plan. It’s the culmination of nearly 18 months of work that took place at the same time we were in the process of taking the company private.

“So when we were presenting this plan to our leadership, we had to ask what impact that might have,” said David Lear, executive director of our sustainability programs. “And unequivocally, they agreed that this was important and no matter what happened, our company would remain committed to it.”

A company our size brings scale to social change and creates infrastructure for others.

“Scale comes slow, though. We offer free recycling in 78 countries, but that didn't happen overnight,” Lear said. “Dell Reconnect, a partnership with Goodwill Industries, has enabled easy drop-off of unwanted technology. It is a great example of the type of partnership we're looking for, but they start small and then get bigger.”

We think bringing a social good advocate like Grenier to our team will help us spotlight sustainability initiatives, and inspire responsible lifestyles in a way that feels simple and easy.

“The 38-year-old actor/filmmaker/entrepreneur - ironically - was the epitome of conspicuous consumption as Vince Chase, the lead character in Entourage,” Jon Swartz noted in USA Today. “Off-screen, he is spreading and living the gospel of green. He uses shredded jeans for insulation, for instance.”

That commitment was critical for Dell.

“I admit I’ve never watched Entourage. I haven’t had cable in 10 years,” Lear said. “I knew Adrian more for his entrepreneurial and environmental endeavors than for his acting career. But a little star power doesn’t hurt us either.”

Writing for Mashable, Lance Ulanoff noted that the actor is especially adept at blending his celebrity profile and social awareness pursuits.

“When people come up to me and they want to take a picture or get an autograph, to me I think it’s a wasted opportunity if I just walk away. I have an opportunity to really connect with somebody in that moment,” Grenier told Ulanoff.

Dell TechCenter#TweetChat #IoTdell - Understanding the Big Data Effect on Commercial and Industrial IoT


The Internet of Things (IoT) presents a new and compelling opportunity for companies to gain deeper insight into their business processes, customers and workflows. Combining machine and sensor data with traditional enterprise information adds greater value to analytics and provides a deeper understanding of the business.

Join Dell Software’s Subject Matter Expert for Business Intelligence and Business Analytics Joanna Schloss (@JoSchloss) along with Chief Research Officer (IMG) Shawn Rogers (@ShawnRog) for a live conversation on Twitter as we explore the Big Data Effect on Commercial and Industrial IoT.

The #TweetChat agenda includes:

  • Evolution or revolution...did IoT come about as evolution or revolution and why does this matter?  
  • How does IoT affect the big data landscape?  
  • What’s the difference between Commercial and Industrial IoT?
  • What about Privacy? Can we know too much?
  • Are there best practices companies should keep in mind?

Where: Live on Twitter – Follow Hashtag #IoTdell to get your questions answered and participate in the conversation!

When: April 3rd 2015 at 10:00 AM PST

William LearaReview Period Has Begun for Spec Updates: UEFI v2.5, PI v1.4, ACPI v6

Big news coming in the BIOS world—updates to the specifications that define UEFI, PI, and ACPI.  See note below from the UEFI Administration:

Dear UEFI Promoter and Contributor Members:

This note is to inform you that the Review period begins today for these 4 products of the UEFI specification process:

*    UEFI Specification 2.5
*    PI Specification 1.4
*    ACPI Specification 6.0
*    SCT Version 2.4B

The final drafts of these updated specifications were approved by the Board of directors on March 10, 2015.

Please refer to the terms of Section 6.4c and 6.4d of the Bylaws for details of the Review and be aware of the following required notifications:

*    The Review period begins today, March 13, 2015
*    Any comments you may have that you feel will require material changes to content in the Final Draft must be sent to the UTWG and the Board on or before 5 PM Pacific Time April 13, 2015.
*    Any Disclosures you may have relating to Contributions to the Final Draft must be sent to the Secretary of the Board on or before April 13, 2015 (by 5pm Pacific time).
*    The Review period ends on April 13, 2015 (by 5pm Pacific time).
*    The proposed date for the Board to hold an Adoption meeting to address any comments from the Review and disposition the Final Draft is April 14, 2015.

For your convenience, all email communication regarding this draft may be sent to admin@uefi.org. The UEFI mailing address is:

UEFI Forum Administration
3855 SW 153rd Drive
Beaverton, Oregon 97003 USA

These draft specifications are available only for UEFI Promoter and Contributor member viewing, so I cannot reproduce them here.  Please access the member section of http://uefi.org to download the specs and learn more.

Dell TechCenterEffective Systems Management in a Multi-Platform World — New Tech Brief

If variety is the spice of life, then why does it cause such headaches for IT?

Most IT administrators walk into heterogeneous computing environments every day knowing they’ll have to manage computers and servers running several flavors of Windows, Mac OS X, UNIX, Linux and, more recently, Chrome OS. Efficiently managing such multi-platform environments is not easy. Tasks like keeping track of all hardware and software, making sure you are in compliance with applicable regulations and licensing agreements, ensuring security, and providing exceptional support become increasingly difficult as your environment becomes more complex.

Systems management with multiple operating systems

In my last post I introduced “Best Practices in Lifecycle Management,” a new position paper from Enterprise Management Associates, Inc. (EMA), and outlined the many IT disciplines that lifecycle management covers, if you’re doing it correctly.

Strong support for heterogeneous computing environments is a must-have in a systems management product, and the paper includes a section specifically identifying the OSes supported by each of the four most popular systems lifecycle management solutions on the market:

  • Dell KACE K1000 Systems Management and K2000 Systems Deployment Appliances
  • LANDESK Management Suite 9.6 SP1
  • Microsoft System Center 2012 R2 Configuration Manager (SCCM)
  • Symantec Altiris Client Management Suite (CMS) 7.5 SP1

If you spend your workday managing a multi-platform environment, then you’ll find EMA’s checklist a valuable resource in preparing for your next investment in a systems lifecycle management product.

Get a free copy of the tech paper and have a look at the section “Heterogeneous Support” on page 5 of the PDF.

Dell TechCenterWebcast: Eliminate Active Directory Disaster Recovery and Migration Headaches the Virtual Way

If you are planning an AD migration or preparing an AD disaster recovery plan, you'll want to test your plans to ensure success. But how do you create a test environment, and who even has the time? Register for this educational webcast. Industry...(read more)

Jason BochevCloud Director Database Migration

This week I’ve been working on getting some lab infrastructure fitted with much-needed updates. One of those components was an aging Microsoft SQL Server 2008 R2 server on Windows Server 2008 R2, which I had been using to host databases for various projects.  Since I had chosen to build the new SQL server in parallel, I’m benefiting from fresh, problem-free builds of Microsoft SQL Server 2012 on Windows Server 2012 R2.  The downside is that I’m responsible for migrating all of the SQL databases, logins and potentially scheduled jobs to the new SQL server.

vCloud Director is one of the last databases left to migrate, and fortunately VMware has published a KB article which covers the steps required to migrate a back-end SQL database for vCloud Director.  The VMware KB article is 2092706, Migrating the VMware vCloud Director SQL database to another server.

Looking at the steps, the migration looks fairly simple.  VMware even provides the SQL queries to automate many of the tasks.  I’ll migrate my vCloud Director database using these steps in the following video.  I did run into a few issues, which mostly boil down to copy/paste problems with the SQL queries as published in the KB article, but I’ve provided the necessary corrections and workarounds in the video.

As shown in the video, I ran into a syntax issue with step four.

The SQL query provided by the KB article was:

USE master;
GO
EXEC sp_attach_db @dbname = N’vCD_DB_Name‘,
c:\Program Files\Microsoft SQL Server\MSSQL\Backup\vCD_DB_Name.mdf
c:\Program Files\Microsoft SQL Server\MSSQL\Backup\vCD_DB_Name.ldf
GO

The corrected SQL query syntax, according to Microsoft SQL Server Management Studio, appears to be:

USE [master]
GO
CREATE DATABASE [vCD_DB_Name] ON 
( FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL\Backup\vCD_DB_Name.mdf' ),
( FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL\Backup\vCD_DB_Name.ldf' )
 FOR ATTACH
GO

Another issue I’ll note that wasn’t captured in the video deals with step seven, where the vCloud Director cell server is reconfigured to point to the new database.  The first time I ran that step, the process failed because the cell attempted to locate the SQL database in its original location, and it actually found it there.  When this occurs, the cell configuration script doesn’t prompt for a new SQL instance.  To make step seven work correctly, I had to drop the database on the SQL 2008 R2 server and rerun the vCloud Director configuration script.  The cell then no longer automatically ‘finds’ the old instance, so it correctly prompts for the new back-end database details.  VMware’s KB article provides most of the steps required to migrate the database, but it needs a step inserted before step seven calling for the deletion of the original database.  Step two places the vCloud database in READ_ONLY mode, but the vCloud cell configuration was still able to ‘see’ it, which causes step seven to fail.
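
To make that missing step explicit, here is a minimal sketch (Python with pyodbc) of dropping the original database on the old SQL 2008 R2 instance; the server, login and database names below are placeholders, not values from the KB article:

# Hypothetical sketch: drop the original vCloud database on the old SQL Server
# so the cell configuration script stops 'finding' it and instead prompts for
# the new back-end database details. Connection values are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=old-sql-2008r2;DATABASE=master;"
    "UID=sa;PWD=changeme",
    autocommit=True,  # DROP DATABASE cannot run inside a transaction
)
cur = conn.cursor()
# Take exclusive access first; step two left the database in READ_ONLY mode.
cur.execute("ALTER DATABASE [vCD_DB_Name] SET SINGLE_USER WITH ROLLBACK IMMEDIATE")
cur.execute("DROP DATABASE [vCD_DB_Name]")
conn.close()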

Blake Garner (@trodemaster on Twitter) provided a helpful tip which will also work with step seven in lieu of dropping or deleting the database on the original SQL server:

You could also clear DB config from the /opt/vmware/vcloud-director/etc/global.properties and run configure again.

Overall the process was still fairly simple and painless thanks to VMware’s published documentation.

Post from: boche.net - VMware Virtualization Evangelist

Copyright (c) 2010 Jason Boche. The contents of this post may not be reproduced or republished on another web page or web site without prior written permission.


Dell TechCenterAudit Oracle Data Changes With SharePlex’s Change History Feature

Instead of implementing Oracle Audit_Trail to capture metadata on data changes like userid, timestamp, type of operation, etc., avoid the overhead and capture more information using SharePlex’s Change History. SharePlex uses its CDC replication...(read more)

Dell TechCenterLearn to Work with eQuotes

The eQuote self-service feature allows an authorized user to create, review, edit and retrieve an electronic quote when ready to purchase. Below we go into how to save, retrieve and edit the eQuotes you create within your Premier page. As well...(read more)

Dell TechCenter5 Ways to Shop and Search Products on Premier

Premier provides you with a customized, secure online toolset for purchasing, reporting, and researching products and support. This guide shows you how to make the most of this customized procurement tool. Provided below are five ways in which you can shop...(read more)

Dell TechCenterThe top 3 considerations for developing a Proof of Concept for OpenStack

A look at three important considerations for developing a Proof of Concept for OpenStack. (read more)

Dell TechCenterWhen smart people won’t use smart technologies: Avoiding resistance and ensuring success for analytics projects

By Dr. Thomas Hill, Executive Director, Analytics, Information Management Group, Dell

Big-data predictive analytics offer the promise of better outcomes and lower costs for healthcare organizations, effectively allowing a patient to access the expertise of thousands of experts gained through treating millions of patients. But successfully deploying the technology isn’t always easy. How you plan for and introduce analytics is critical to acceptance by stakeholders and a willingness to take action based on the knowledge you generate.

The Locomotive from Nurnberg to Furth: The fear effect of technology

Disruptive technologies almost always elicit initial skepticism and even fear. When the first steam locomotive prepared to make its run from Nurnberg to Furth in 1835, people were concerned about noise and pollution and feared that human physiology might not support travel at speeds over 20 mph. Other useful-but-disruptive advances have also generated initial fear.

In research published over 30 years ago, I demonstrated how a lack of a sense of control generated fear when personal computers revolutionized computing at universities. Interacting with a black box that generates results as if by magic, without giving any control to the end-user, will always generate distrust.

Projects will fail if analytics technologies are perceived to usurp personal judgment and control over final decisions.

Presenting analytics to stakeholders as a tool they can use will empower them and pre-empt fear. Demonstrate how these new tools help healthcare professionals to quickly evaluate risks, potential outcomes, and what-if scenarios. Predictions, recommendations and prescriptions derived from analytics need to come with reasons why a particular risk is indicated and how recommended actions will affect outcomes.

Avoid alarm fatigue with unambiguous, actionable alerts

People cease to pay attention when alarms are too frequent or information doesn’t present clear options for action.

Enhance rather than add to existing processes, screens and alerting rules, and ensure that important information is unambiguous, actionable and consistent with existing workflows. Think through where analytic results will be used, identify benefits and ROI, and make certain that information is actionable. For example, Dr. John Cromwell implemented a system at the University of Iowa Hospitals and Clinics which sends real-time, actionable risk information to the operating room, helping surgeons avoid post-surgical infections.

Don’t add to the onslaught of computer work

General surgeon Jeffrey Singer recently noted in the Wall Street Journal that rigid electronic health records systems promote “tunnel vision in which physicians become so focused on complying with the EHR worksheet that they surrender a degree of critical thinking and medical investigation.”

Analytics technology should be entirely hidden, yet deliver reliable information about risks, best next action and alternatives. Don’t require medical professionals to complete yet another computer screen.

Know the end point and how to measure results

One of the most important things to consider before embarking on any IT project is to clearly establish what the completed project would look like and how to measure success. Avoid projects that are attractive because of the “cool” technologies involved without clear definitions of success and ROI.

Think about what ideal results look like, who would use them and how, and how something of value would be created. Involve key stakeholders and end users to reflect their concerns and perceived barriers to success. Once you know exactly and operationally how success is defined, everything else follows: where to look for which data, how results are delivered, and what level of integration, training, operational change or new resources and personnel is required.

Decide what data you need

Data acquisition and preparation is always the most time-consuming and difficult part of any advanced predictive analytics project. EMR systems are mostly closed and data from different sources and repositories use different labels and metrics for the same measurements. For example, reports from different laboratories may use different formats, scales and nomenclatures.
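
As a toy illustration of that normalization problem, here is a short Python sketch that maps a lab result reported under different labels and units onto one canonical form. The labels, units and conversion factor are invented for the example:

# Toy sketch: normalize a glucose result that different labs report with
# different labels and units. All names and factors here are illustrative.
UNIT_FACTORS = {"mg/dL": 1.0, "mmol/L": 18.0}        # approximate glucose factors
LABEL_ALIASES = {"GLU", "Glucose", "glucose_serum"}  # same test, three names

def normalize(label, value, unit):
    """Return the measurement as canonical glucose in mg/dL."""
    if label not in LABEL_ALIASES:
        raise ValueError("unknown test label: %s" % label)
    return {"test": "glucose", "value": value * UNIT_FACTORS[unit], "unit": "mg/dL"}

print(normalize("GLU", 5.5, "mmol/L"))   # {'test': 'glucose', 'value': 99.0, 'unit': 'mg/dL'}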

Before you begin, think through whether you need immediate ROI for a specific project or a more complete analytics solution. A project-specific approach allows you to go after low-hanging fruit using data that are easiest to get and integrate; a longer-term approach, to support diverse projects, requires building a robust general data-analysis warehouse with a Master Patient Index, terminologies and translation logic, and incorporating adapters that allow integration with relevant data sources.

Create governance

Finally, there is governance, though ideally this would come first. Often overlooked, governance is important for two major reasons. First, many projects initially succeed, but then fold after the project champion departs, leaving nobody who knows and understands how it all works and where the data are. Second, regulatory oversight and scrutiny will become important when analytics affect real patient outcomes.

A role-based system with lifecycle management, version control, audit logs, approval processes, etc., will solve the issue of departing champions as well as the need to document how predictive models were built, validated, approved and implemented.  Good examples of this can be found among pharmaceutical and medical device manufacturers, which have for years incorporated mature governance features to meet these challenges.

Summary

When advanced analytics projects fail and smart professionals decide not to leverage smart technology to improve outcomes, the cause is often project leaders who ignore critical steps when planning and implementing such systems. Future healthcare will inevitably rely greatly on advanced predictive and automated analytics to help health care professionals produce better patient outcomes more reliably and effectively. Getting it right from the start will create that future faster and benefit everyone.

I’ll be at HIMSS, April 13-15 in Chicago, leading a tweet-up discussion on the future of population health management on April 15 at 11 am. I look forward to hearing your thoughts on this topic.

About the author

Dr. Thomas Hill is Executive Director for Analytics at Dell’s Information Management Group. He joined Dell through the acquisition of StatSoft Inc. in 2014, where he was Senior Vice President for Analytic Solutions for over 20 years, responsible for building out Statistica into a leading analytics platform. He was on the faculty of the University of Tulsa from 1984 to 2009, where he conducted research in cognitive science. Dr. Hill has received numerous grants from the National Science Foundation, National Institute of Health, Center for Innovation Management, Electric Power Research Institute, and other institutions. Over the past 20 years, his teams have completed consulting projects with companies across various industries in the United States and internationally. Dr. Hill is the (co)author of several books, most recently of Practical Text Mining and Statistical Analysis for Non-Structured Text Data Applications (2012) and Practical Predictive Analytics and Decisioning Systems for Medicine (2014).

Dell TechCenterTablets: Productivity gains without security compromise

Tablets have made inroads into the business world, thanks to strong demand from a mobile workforce. Many organizations now include tablets in their standard IT offerings, whether part of a corporate-owned or bring-your-own-device (BYOD) program. But are...(read more)

Dell TechCenterCIOs Gain Productivity and Control with Dell Boomi’s API Management, Integration and Master Data Management (MDM)

Today’s CIOs need to ensure efficiency and maintain service levels for their organization—and to accomplish that in the face of the explosion of web services, they need to automate the way they manage those services, whether in the cloud or on-premises. Forward-thinking CIOs are turning these web services into APIs—but managing these APIs throughout the entire lifecycle requires significant time and resources, unless you find a way to automate it.

Dell Boomi API Management offers a way to manage the complete lifecycle of those APIs, allowing customers to create, publish and manage APIs on a single multi-purpose Platform as a Service (PaaS)  that seamlessly brings together API Management, Integration and Master Data Management (MDM). And it does that with a consistent look and feel that decreases complexity and increases developer productivity. So now your business can move, manage and govern data from a single platform.

“One of the most significant challenges we face today is to support the development and deployment of new web services that connect with our backend systems, so we can expose new services and capabilities to our customers via our website,” stated Ken Bol, Integration Specialist at Brady Corp. “Dell Boomi gives us the ability to quickly create, publish and manage these services and bring new value to customers and our business.”

More SaaS and Legacy Data Means More APIs

As line-of-business managers seek to gain competitive advantage by implementing new applications, often as cloud-based applications, the number of data endpoints CIOs need to manage increases, as well. At the same time, these executives are still relying to some degree on legacy systems that must be integrated within a hybrid IT environment—a condition that will continue for many years to come.

Add to that the continued expansion of social media, reliance on mobile devices, and the rise of the Internet of Things (IoT) and Big Data, and one thing is clear: CIOs need to take action now.

According to Gartner, API Management Services are forecast to grow from $552.8 million in 2013 to $1,002.8 million in 2018, a 13 percent CAGR, while Cloud API Management Services are forecast to grow from $85.6 million in 2013 to $233.2 million in 2018, a 22 percent CAGR.[1]
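
As a quick arithmetic check on those growth rates (a Python sketch; only the dollar figures come from the Gartner citation above):

# Compound the 2013 bases forward five years at the stated rates.
api_2018 = 552.8 * (1 + 0.13) ** 5     # ~1018.5, close to the cited $1,002.8M
cloud_2018 = 85.6 * (1 + 0.22) ** 5    # ~231.4, close to the cited $233.2M
print(round(api_2018, 1), round(cloud_2018, 1))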

Gain Control over APIs throughout the Lifecycle

Without a unified way to manage your APIs, CIOs will soon be staring down a perfect storm: an unmanageable proliferation of APIs driven by the anticipated exponential increase in demand over time. With Dell Boomi’s full lifecycle API Management service, organizations can quickly create, publish and manage new APIs for their constituents. Customers can Create APIs from new or existing Boomi integration processes with full lifecycle management (versioning) through a visual experience—with no coding necessary. They can also Publish APIs on-premises or in the cloud with comprehensive security capabilities and multiple authentication options. And businesses can Manage their APIs through traffic control and monitoring dashboards.

With Dell Boomi API Management, you can rapidly convert any endpoint into a web service. In addition to creating web services from HTTP, Dell Boomi API Management lets you expose existing databases, FTP sites, legacy applications, new cloud applications, and mobile applications—as APIs.

Because it’s part of our single, multi-purpose PaaS, Dell Boomi API Management, along with Integration and MDM, helps businesses experience faster time to value—and lower training costs. At the same time, API Management delivers speed, business agility, improved security and better performance, helping you to maintain control as you transform your business, so you can continue to ensure the level of service your organization has come to expect.



[1] Source: Gartner, Market Trends: Platform as a Service, Worldwide, 2013-2018, 2Q14 Update, 19 June 2014

Dell TechCenterAnnouncing Cloud Access Manager 8.0

I'm pleased to announce the immediate availability of Dell One Identity Cloud Access Manager 8.0. This major release is a significant update to our web single sign-on, web access management and identity federation solution. And in a way, it includes a "little something for everyone" - meaning it has new capabilities in many different focus areas, including business-to-consumer (B2C) deployments, strong authentication scenarios and in-house application development.

Adaptive, Risk-Based Authentication

With all the news these days about security breaches - and so many of them involving stolen passwords - it's no surprise that strong authentication (the use of two-factor authentication solutions, smartcards, X.509 certificates, etc.) is seeing a resurgence in importance. By requiring something beyond a username/password to access applications, security professionals can better protect their enterprise data. At the same time, no one wants to unduly limit productivity - indiscriminately putting barriers between users and their work is an equally risky proposition.

The best approach to employing strong authentication for web applications is to do so with an awareness of context - information like "is the user on a browser they've used before" or "is this physical location and time of day typical of the user's login history." These context data elements can be used to assess how risky an access request is - how likely it is that the person on the other end is your user, and not an attacker.
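
To make the idea concrete, here is a toy Python sketch of context-based risk scoring; the signals, weights and thresholds are invented for illustration and are not taken from any Dell product:

# Toy context-based risk scoring. Weights and thresholds are invented.
def risk_score(known_browser, usual_location, usual_hour):
    score = 0
    if not known_browser:
        score += 40   # unfamiliar browser fingerprint
    if not usual_location:
        score += 40   # login from an atypical location
    if not usual_hour:
        score += 20   # outside the user's normal hours
    return score

def decision(score):
    if score >= 80:
        return "block"            # too risky: deny the request
    if score >= 40:
        return "second factor"    # step up to two-factor authentication
    return "allow"                # low risk: password alone is enough

print(decision(risk_score(known_browser=False, usual_location=True, usual_hour=False)))
# -> "second factor" (40 + 20 = 60)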

CAM 8.0 ships a new component called Dell's Security Analytics Engine, whose job is to assess these very context elements and give CAM the ability to adapt to heightened risk by asking the user to present a second factor of authentication, or by blocking the user altogether. Dell's Security Analytics Engine can work on its own, or it can optionally leverage information from complementary Dell solutions like SonicWALL network security appliances or SecureWorks threat intelligence data. And CAM can apply these risk policies to individual high-risk applications, or to the entire application environment.

Social Authentication

Users forget passwords - but not all passwords are equally likely to be forgotten. A user is much more likely to recall the AD password they use to access the network each day than a password for a less-frequently-visited internet site. This is why social authentication - authenticating to internet sites with Facebook, Microsoft LiveID and similar credentials - has become so popular. Social authentication allows end users to remember fewer passwords, and that convenience is extremely valuable to them.

CAM 8.0 now supports the OAuth 2.0 protocol as a client, which enables end users to authenticate to the centralized authentication infrastructure using credentials from popular social web sites. In an important twist, since social sites seldom hold the kind of data organizations use for determining roles and application permissions, CAM presents an "account linking" process so that authorization can be driven from internal data, while authentication is outsourced to a password users are more likely to remember. This may be appealing, for example, to educational institutions targeting alumni, or to organizations running "portal" environments for customers or partners.
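
For readers new to the protocol, this is roughly what the OAuth 2.0 authorization-code exchange looks like from the client side (a Python sketch using the requests library; the endpoint URL, client credentials and helper name are hypothetical, not CAM's actual configuration):

# Sketch of an OAuth 2.0 authorization-code exchange. All values hypothetical.
import requests

TOKEN_URL = "https://social-idp.example.com/oauth2/token"
CLIENT_ID = "cam-client-id"
CLIENT_SECRET = "cam-client-secret"
REDIRECT_URI = "https://cam.example.com/callback"

def exchange_code_for_token(code):
    """Swap the one-time authorization code for an access token."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })
    resp.raise_for_status()
    return resp.json()   # access_token, token_type, expires_in, ...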

Mobile Application Development

For organizations with their own IT development group, modern application development is seeing a different kind of resurgence - the resurgence of "rich client" applications. Specifically, organizations are now prioritizing the development of native mobile applications (apps that don't run in a browser, but instead natively in a mobile device OS) as first-class citizens on par with - and sometimes ahead of - web application interfaces. 

CAM 8.0 introduces support for the modern protocols used most for mobile application development, namely OAuth 2.0 (this time as an authorization server) and OpenID Connect. Unlike the Security Assertion Markup Language (SAML) protocol popular with web apps, OAuth 2.0 and OpenID Connect were developed with native apps and REST-based interfaces in mind. With this support, organizations deploying CAM can leverage their existing web application authentication infrastructure with this class of applications, as opposed to writing all the access control logic "from scratch" like they did for web applications a decade ago.

There's even more new in Cloud Access Manager 8.0, but this post is getting too long! I encourage those interested in learning more to visit CAM's web page, and to post here if you have any questions.

Dell TechCenterCongratulations to Merle Giles

Congratulations to Merle Giles, one of HPCWire's People to Watch 2015.(read more)

Dell TechCenterDell Named a Global Elite Launch Partner for Skype for Business: How our investment translates into customer success


Microsoft has named Dell a Skype for Business Global Elite Launch Partner, affirming our enduring commitment to delivering comprehensive enterprise solutions for Microsoft unified communications and collaboration (UC&C) applications. 

Teams across Dell are responsible for helping our customers navigate the path from old to new ways of using technology to improve communications with customers, colleagues and suppliers. Our mission: to deliver a greater return on  customers’ investments in Microsoft Exchange, SharePoint, Lync and the highly anticipated new version of Lync, Skype for Business.  

For example, we know that deploying a robust unified communications (UC) solution such as Skype for Business touches the entire organization, from line-of business managers, to users, to IT managers and application administrators. Maximizing return on investment means delivering the infrastructure, end-user devices, software tools, training and services for UC that help each of these customer roles be more efficient and effective.

Unlike any other single supplier, Dell offers end-to-end solutions for Microsoft UC&C. With one vendor relationship, you get:

  • A wide portfolio of servers, networking, storage, PCs and monitors optimized for Microsoft UC&C applications, to help IT deliver better performance, greater capacity, maximum scalability and an enhanced user experience
  • IT software tools to streamline deployment, management and administration--such as our award-winning Unified Communications and Collaboration Command Suite
  • Headsets, phones and room systems qualified by Microsoft for use with their UC platform
  • Software licenses
  • Consulting services to help plan, deploy, manage and support Microsoft UC&C environments.

Today, Dell has 23 Microsoft Gold competencies--more than any other Microsoft partner--a status we’ve earned by exhibiting increased levels of real-world customer experience and expertise, independently verified by Microsoft.

Watch this video to learn more about how we can uniquely help you transform your communications. 

(Please visit the site to view this video)

Find more information here:

  • Read about our customers’ transformations in this ebook
  • Check for ongoing updates at Dell.com/ucc
  • Find reference architectures for Microsoft UC&C solutions in our TechCenter.

Dell TechCenterDell Software Challenges Industry to Focus Squarely on Customers

We’ve seen reports recently of major changes happening within some of the largest players in our industry. While it’s not unusual for organizations to make business changes from time to time, major upheaval and organizational restructuring – along with the accompanying uncertainty – can take a serious toll on the business, in terms of partner and customer confidence as well as employee morale. I’ve worked in quite a few different companies and navigated through some very challenging times, and I know that when a company loses its focus and becomes more intent on serving Wall Street than its customers and employees, significant changes tend to follow. This is when companies lose their best, highest-quality people rather quickly, which affects all aspects of the business, most notably how customers can be served and supported.

Certainly Dell has undergone changes in the past year, and as we approach the one-year anniversary of our privatization, we’ve remained laser-focused on our customers. As deliberate and specific moves have been made to change Dell from a PC company into an end-to-end IT company, we have consistently used our customers as the one driving force for how we evolve our business. In fact, our software portfolio is so well aligned to the needs of our customers that we can fulfill about 70 to 80 percent of most IT needs right now.

So, what really drives us? Helping our customers win with our solutions. This should be the key business driver for all IT providers, but in reality it often isn’t. I’ve traveled around the world recently, meeting with Dell Software customers and partners. I’ve heard stories of how our single focus on addressing their needs has helped them achieve efficiencies, innovate, and grow with changing needs, and I want to share a few examples:

  • International Relief and Development (IRD), a leading global humanitarian organization, has successfully leveraged Dell SonicWALL NSA and TZ series firewalls to safely perform its work around the world. The organization saved $1 million in operational costs through fast and reliable connection and data delivery. Three thousand users in 47 countries are protected by Dell SonicWALL, which has allowed relief activities to continue unabated since 2003. View the IRD video here.
  • Denver Museum of Nature and Science is using the Dell KACE K1000 Systems Management Appliance to deliver the highest levels of endpoint security, while clearing major “Internet of Things” (IoT) obstacles caused by a rise in the number and types of devices connected to their network.
  • Housing Authority of the Cherokee Nation (HACN) needed a reliable, efficient and easy-to-use backup and recovery solution to ensure delivery of critical services. With Dell AppAssure, HACN now can manage backup, replication and recovery from a single, easy-to-use interface. HACN can recover an entire virtual server in 10-15 minutes, has reduced IT workload by 30–50 percent, and cut storage costs with a 1:3 compression ratio. View the HACN video here.

(Please visit the site to view this video)

  • Locala Community Partnerships, a U.K. healthcare provider, wanted to serve more patients and improve clinician efficiency by giving staff real-time access to records from health centers, schools and homes. Locala engaged Dell Managed Services to design, deploy and manage a Dell UCC solution, with which staff now can access email, calendars and patient data from almost anywhere using their laptops or mobile devices, and Microsoft Office 365. Employees also can utilize Microsoft Lync for video conferences from their desktops. Patient records are protected with Dell SonicWALL firewall technology, and fast performance is facilitated by Dell PowerEdge servers, Dell PowerVault storage and Dell Networking switches.

In addition to these customer stories, recent award wins have given us further evidence that Dell is putting together a software solution portfolio offering the most recent technology developments, enabling our customers to transform, inform, connect and protect their organizations in ways that uniquely fit their requirements. Dell’s Unified Communications Command Suite just received the TMCNet/Internet Telephony Unified Product of the Year award, and Dell AppAssure DR6000 was named a 2014 Backup Product of the Year by Storage Magazine. Additionally, Dell Software’s security and compliance solutions were recognized as finalists in 13 categories of the InfoSecurity Global Excellence Awards.

We’re passionate about providing our customers with innovative software solutions that address their business and IT needs, and we believe it should be industry standard to consider how every change – whether to business models, portfolios, channel, services, financing or support – will affect them. I’m proud of our hard work, and of the recognition our team has received in the last couple of months. By listening to our customers and staying focused on their needs, we know we’re staying on the right track.

If you have any interesting stories about the value you have seen from a strategy focusing squarely on customers, I would love to hear from you at dave_hansen@dell.com.

Dell TechCenterDell Services is excited to be part of CeBIT 2015

Showcasing SAP Solutions, SAP Services and customer success stories.

Meet with the Dell team at CeBIT 2015—going on now in Hanover, Germany—to learn how your organization can overcome the challenges impeding business growth and consuming resources. You’ll also learn how you can increase market share, while improving your customer experience by adapting to shifting market demands.

Dell can help stretch the value of your IT investment with our extensive knowledge of infrastructure solutions, business processes, packaged applications and proven consulting methodology. We provide solutions and services that range from a strategic roadmap to tactical business process and application management.


With Dell expertise on SAP, you can do more. At CeBIT 2015 you can learn about:

  • SAP HANA solutions – In-memory and Big Data, from idea to realization
  • SAP Infrastructure – Flexible and scalable with best-in-class TCO
  • Dell ZeroIMPACT™ Migrations for SAP Platforms
  • Further value-added Dell services for SAP – Shop Floor Digitization, SAP Optimization, Mobile Solutions, Business and Predictive Analytics

Meet us at Booth #C04 in Hall 4 and let's find out together how you can transform data into valuable knowledge and reflect individual requirements more quickly in your IT systems.

Arrange a meeting now 

Dell TechCenterMigration Suite for #SharePoint Now Available in Microsoft #Azure Marketplace

SharePoint migrations are complex and time-consuming, and native tools for SharePoint migrations often leave behind essential information, such as metadata, permissions, and workflows. They’re also very limiting when it comes to migrating SharePoint...(read more)

Dell TechCenterGetting to "Yes" with Dell end-to-end IT infrastructure and OpenManage systems management

Yes, companies of all sizes can have IT all:

Yes, Dell Future-Ready IT is ready now and companies are choosing Dell to replace their aging IT infrastructure, more efficiently manage their IT environments, and to accelerate positive business results.   

Yes, Dell PowerEdge servers have provided open-standards, award-winning solutions for our customers for years, and Dell recently added the industry-lauded FX2 converged platform with additional inherent, powerful and power-saving management capabilities provided by the Chassis Management Controller.

Yes, Dell OpenManage systems management sets a gold standard, yet has still managed to become more relevant, simplified, automated, and even mobile today! 

Yes, in addition to local and remote management available through iDRAC with Lifecycle Controller, the embedded intelligence in PowerEdge servers, OpenManage Essentials and OpenManage Mobile also monitor and manage Dell storage, networking, firewall appliances, and third-party systems – from anywhere, at any time.  Data center management now travels with IT administrators on an Android or iOS mobile device, when and where convenient or necessary.

(Please visit the site to view this video)

Yes, Dell understands that many customers have invested in 3rd party systems management consoles.  They too can experience all the benefits of Dell’s agent-free server lifecycle management capabilities simply by utilizing OpenManage Integrations for Microsoft System Center, VMware vCenter or BMC BladeLogic with their existing IT management solutions. 

Yes, Dell engages with thousands of organizations each year who look to Dell for end-to-end enterprise IT solutions, and after implementation, tell us how our technology is working for them and enabling them to improve services to their customers.  In their words:        

  • Distributech (Canada): “…Dell provided switching, servers, storage, firewalls and backup… and Distributech now saves hundreds of hours a year of IT staff time with OpenManage Essentials, OpenManage Mobile, and OpenManage Integration Suite for Microsoft System Center.”
  • China University of Geosciences: “... deployed an HPC solution based on a Dell PowerEdge VRTX shared infrastructure platform and Dell Networking switches to power its research…and has been able to improve efficiency in managing IT by 50 percent through use of the Dell Chassis Management Controller (CMC) module and embedded OpenManage technology.”
  • MakeMyTrip (India): “…chose Dell PowerEdge servers and Dell storage arrays to consolidate their private cloud infrastructure.  OpenManage Essentials provides a single view into all three of their data centers, providing centralized discovery, inventory and management that has reduced the maintenance effort by 50 percent.” 
  • And most recently, a large United States financial institution found our Dell FX2 converged system to be the perfect solution for their next-generation virtualization platform and OpenManage Essentials and iDRAC with Lifecycle Controller a huge boost in comparison to the competitors’ tools that provided them little useful functionality. Financial savings accrue rapidly now that this G500 account can quickly perform one-to-many provisioning pushes and automated bare-metal installs which previously had to be done manually.  

Getting to “Yes” is truly a “Win-Win” for our customers, their customers, and our partners, with Dell end-to-end IT infrastructure & OpenManage systems management.


Dell TechCenterIs This “Old School” Backup Method Right For You?

Do you have Microsoft Exchange?

Is it critical to your organization?

If you answered “Yes” to these questions, whether you work in IT or on the business side of the enterprise, you get it: your company’s in a bind when Exchange is temporarily unavailable. The situation becomes far grimmer in the “doomsday” case where vital Exchange data is lost and cannot be recovered. These ideas might help.

It’s a given that you need Exchange to be always-on, always there, and always working. And you need data to be quickly recoverable, whether we’re talking about a single email or a whole Exchange database.

Needless to say, protecting Microsoft Exchange is key. There are several ways to accomplish these all-important backups, each with its own specific benefits and drawbacks. 

Over the next couple of weeks, we’ll explore all the ways to protect Exchange, weighing pros and cons, thinking through implications, and reflecting on best practices.

Today, let’s take a closer look at offline backups.

If you’ve worked in IT for any length of time, chances are you’re quite familiar with offline backups. This is an “old-school” backup technique; it’s been around for a long time. And for some business situations, it might work just fine. But it has substantial drawbacks, and for many firms it’s simply inadequate.

The offline backup, as the name implies, involves taking Exchange “offline”. You dismount the database by stopping the Exchange services, perform the backup, either manually or as part of a scheduled automatic backup procedure, and then restart the Exchange services to bring Exchange back online.
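
For illustration, the sequence might look like the following Python sketch; the service name and file paths are placeholders, and a production script would need error handling around each step:

# Illustrative offline-backup sequence: stop Exchange, copy the database
# file, restart. Service and path names below are placeholders.
import shutil
import subprocess

SERVICE = "MSExchangeIS"                       # information store (placeholder)
DB_PATH = r"D:\ExchangeDB\Mailbox01.edb"       # placeholder database path
BACKUP_PATH = r"E:\Backups\Mailbox01.edb"      # placeholder backup target

subprocess.run(["net", "stop", SERVICE], check=True)    # take Exchange offline
try:
    shutil.copy2(DB_PATH, BACKUP_PATH)                   # the actual backup
finally:
    subprocess.run(["net", "start", SERVICE], check=True)  # bring it back online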

Beware: Big Drawbacks Ahead!

The offline backup method can serve a useful purpose for you – it can give you the backup you need. But, as you may have already guessed, it has some big-time drawbacks and limitations:

Exchange is unavailable during the offline backup procedure – For a small operation that has specific daytime business hours, this may not be a problem. But for a large company with an around-the-clock schedule, this presents a major challenge. Why? Your “backup window” during which you can afford to have Exchange unavailable to you and your company is either very small or nonexistent.

A substantial amount of data is at risk – Think about it: If you rely on the offline method and have a very small backup window that happens once every twenty-four hours, and your employees are sending and receiving a lot of email during the day, that represents a bunch of data that’s at risk.

What if you accumulated a full day’s worth of emails that had not yet been backed up, and your Exchange Server crashed at the end of the day? You could have big trouble on your hands, right?

Slow recovery time – When you restore an Exchange database, expect it to take some time, especially if it’s a big file. Also, think about the environment in your office during this time. Exchange will have to be offline, and you could face the wrath of some very impatient coworkers. Something to consider…

No automated backup verification – Traditionally, backups have been notoriously unreliable. Maybe you’ve experienced this situation before: you run a backup thinking everything is OK with it, and then you need to restore a file and find out it’s not available.

Not only is this frustrating, it can cause a big problem for your business and could even cost you your job.  A key drawback of offline backups is that they don’t offer automated verification.  You’ll have to run a manual recovery of a file to verify the validity of the backup data.

A Free, Valuable Resource To Guide You

As you can see, offline backups have some pretty big limitations. In some situations, however, they may be all a company needs. It’s important that you do an analysis of your particular situation to determine what backup method is best for you.

We can help you. We’ve put together an eBook called Six Ways To A Smarter Microsoft Exchange Backup. It evaluates six distinct approaches for protecting your Exchange data. It will help you determine which one is right for you. Download your free copy now!

Dell TechCenterLessons from TDWI: Want Analytics with Your Fries?


At the risk of aging myself, I’ve probably attended 70 or so TDWI conferences and executive summits, but the recent TDWI Las Vegas was different. It marked the organization’s 20th anniversary and reinforced the importance of analytics, which was a big topic among attendees, speakers and vendors.

As a TDWI faculty member for the fourth consecutive year, I taught a whole-day class on social analytics, which focused on driving business value with big data. But more on that subject in my next post. In this blog, I’d like to step back and address changing analytics dynamics.

This vital area has come a long way from its roots in data management, reporting and BI. I reminisced with TDWI president Steven Crofts about all the changes that have taken place over the years. It was fun to remember when TDWI, aka The Data Warehouse Institute, was about innovations in data processing and warehousing. Fast-forward two decades: We’re talking about big data, social analytics, machine learning and cognitive computing.

The same quantum leap holds true when talking about analytics, which has evolved into highly automated, somewhat transparent solutions for ingesting, integrating and leveraging vast amounts of information. As analytics mature, organizations of all types are looking at how they glean greater business value. Some industry segments, like pharmaceutical, manufacturing and retail, are ahead of the curve because market disruption has led to adopting new analytical capabilities and advanced workloads to produce real-time fraud and risk analyses as well as quality control and other complex insights.

Many companies I spoke with are in the midst of retooling their analytics foundations to be more successful. Others are making bold moves. One manufacturer of industrial French-fry equipment is using sophisticated analytics and sensors to better monitor equipment vitals (e.g., the filter is dirty, the heating element isn’t hot enough). In doing so, it can assure its restaurant customers that their equipment is performing optimally.

It doesn’t stop there. The manufacturer also wants to listen to the social signals of its customers’ customers—the folks eating the fries—to better understand satisfaction levels through trend analyses. If it learns through social analytics that diners complained about substandard fries at a customer’s establishment, it could proactively help that restaurant take action. As a result, this company will be able to differentiate itself by helping its customers before something hurts their brand. How cool is that?

Regardless of where companies are in the analytics adoption curve, there are major drivers accelerating change across the entire data landscape. As business analytics mature, there will be continued movement along these four pressure points:

  1. The user community is changing. Companies must deal with a new and growing set of users. You no longer need to be a major nerd or a Ph.D. to drive business analytics. I call this “the convergence of the suits and hoodies.” Thanks to the Google generation, we now have users with a very different view of how to gain access to information and why. The “hoodies” are all about collaboration; the “suits” have a more traditional data management view. Together, they are driving significant changes in how companies maximize data value. Both perspectives need to be taken into consideration.
  2. The business side now runs the show. IT used to be in charge of most analytics projects because the business side had to rely on IT to supply whatever was needed. While IT still plays a vital role in enabling technology, the shoe is on the other foot. Business has a louder voice in expressing what they want and need as well as how they leverage data to produce actionable business insight.
  3. New economic and technology advantages drive innovation. Open source solutions like Hadoop can make business analytics more affordable and therefore more accessible, as does running diverse workloads on commodity hardware. Moreover, new technologies such as in-memory analytics, faster storage and compute capabilities, and cloud and distributed architectures are making a big difference by delivering the increased speed and scalability needed to perform automated and advanced analytics.
  4. New data types produce more mature views of information. For the past 20 years, companies have looked longingly at some data they wished could be incorporated but couldn’t because the technology wasn’t there or it was cost prohibitive. Now those barriers are breaking down, enabling organizations to add social, machine and sensor data for diverse and interesting data perspectives.

New types of data can serve a wider community of users who will want to mix, match and mash up information to yield value in responding to business needs. This will lead to a more mature mantra, from any company looking to innovate: “Put the right data, on the right platform, at the right time, for the right workload.”

In my next post, I’ll offer more insights from TDWI by sharing experiences and more real-world examples from my class, “Social Analytics: Driving Real Business Value with Big Data.” Until then, drop me a line at Shawn.Rogers@dell.com to share your mantra for business analytics.

Dell TechCenterFoster productivity in your employees’ preferred environment

Your office is no longer defined by a desk within your employer’s walls. You might have the flexibility to work from home, client offices, coffee shops or other locations. But where are you most productive? Employee effectiveness in diverse environments...(read more)

Dell TechCenterA Single Approach to Anypoint Systems Management — Balancing Privacy and Productivity

How are you keeping your end users happy and productive in the age of mobility and BYOD?

In my last post, I described the Dell KACE-Dell Enterprise Mobility Management combination for bringing all of your organization’s traditional, mobile and BYO devices under one umbrella in a single systems management product.

“That’s all right for the devices,” you say, “but what about the people using them? How does Dell KACE-Dell EMM deal with issues of privacy and productivity?”

Fair enough. After all, what good is it to address IT’s concerns for tracking and managing all connected devices if it only leads to resistance among users worried about privacy on their personally owned devices?

Balancing privacy and productivity

Don’t forget that the main reason for putting all your blood, sweat and tears into a BYOD initiative is that users can be more productive when they use their own devices. The freedom to use the corporate network as productively as possible lets users take advantage of the apps, devices and means of communication they know best. But in the same way that users expect the organization to respect personal privacy, they also expect it to respect the privacy of personal data on their own devices.

The balance, then, is to separate business and personal use clearly and securely right on the device. The Dell KACE-Dell EMM combination for anypoint systems management strikes that balance with the Dell Mobile Workspace app that users install on their own smartphone or tablet and the Dell Desktop Workspace application for their BYO laptop. Other than installing the secure workspace on their personal device, where they can securely access corporate resources and do their work, users don’t see any difference in their devices. IT has access only to the workspaces on their personal devices and users can be assured that personal apps and data are safely partitioned and completely inaccessible by IT. Workspaces also assure IT of the same level of security desired for corporate-owned assets, with features like encryption, secure remote access, firewall and prevention of data loss for the corporate information residing within the workspace on a user-owned device.

In my next post I’ll have more details on how anypoint systems management looks from the IT perspective, so stay tuned. Meanwhile, have a look through Part 2 of our new e-book, “A Single Approach to Anypoint Systems Management,” right now.

Dell TechCenterMichael Dell and Marc Benioff Face Off in Fitbit Celebrity Challenge

There’s not much that Salesforce CEO Marc Benioff and I have in common, but there is one thing -- we both lost 30 pounds wearing a Fitbit.

Actually, today I’m down 32 pounds, but more importantly I’m much more active and healthy than I was before the fitness tracker gave me a wakeup call about just how much I was sitting in front of my computer. To help users keep motivated, Fitbit launched challenges last August. I recently competed with several friends in a “Workweek Hustle” challenge and can attest to the fact that it accomplishes that goal.

At the beginning of the year, Fitbit extended the challenges to celebrities, and I’m excited to see that our own Michael Dell is stepping up to face Benioff in Fitbit’s third Celebrity Challenge March 16-20. The challenge is to see which CEO can log the most steps in five days to raise awareness and money for the American Heart Association (AHA). The winner will have $10,000 donated in their name to the AHA from Fitbit.

It’s no secret that both men already had a bit of rivalry going with their fitness efforts. Last year, Benioff noted during a World Economic Forum panel how Dell noticed when a few days passed without any activity on his Fitbit.

“The reason I’m telling this story is so that you can get an idea of where we are going in the world, with everything getting connected,” he said when he brought it up again at the recent Fortune Brainstorm Tech dinner.

“Until now, most of us have made our health and fitness decisions based on what we think we know about ourselves,” said Michael Dell in a Q&A about his participation in the Fitbit Challenge. “Advancements in technology – wearables and otherwise – will eventually take much of the guesswork out of healthy living. And we’ll all benefit from it.”

You can follow the action at www.fitbit.com/celebrity-challenges and on social media through #FitbitForAHA; or you can get involved and add to Michael’s total step count by donating to AHA via www.fitbit.com/celebrity-challenges. Every dollar donated in his name equals 10 additional Challenge steps.

Dell TechCenterFuture-ready managed private clouds

Today’s amazing mix of cloud computing, ever-smarter mobile devices and collaboration tools presents both an opportunity and a challenge for federal government agencies. New expectations and ever-present, changing security needs require the federal government to be ready to deliver and receive digital information and services anytime, anywhere and on any device. Digital-savvy government workers demand easy-to-use ways to stay connected and access data securely so they can be productive. In addition, federal agencies are looking for better ways to effectively use massive amounts of data, creating valuable insights that advance their mission – all within current or declining budgets.

Cloud computing plays a key role in building a 21st century platform for federal agencies to better serve the American people. Cloud enables agencies to capitalize on innovative technologies and services that simplify IT operations, control costs and drive efficiencies.

A frequent question I hear from our federal customers is: how do we accelerate the transition to the cloud given constant budget and IT staffing constraints, as well as legacy systems? As with most questions regarding cloud, there are several answers. In this blog post, we’ll take a look at one of those options: the managed private cloud.

Accelerating your path to a secure, federally compliant private cloud

When speed is the name of the game, a managed private cloud rewrites the rules for driving your mission forward. It allows you to leapfrog to cost-effective, scalable cloud technology without upfront hardware or software costs. No longer are you bogged down by worries about scarce capital, long lead times to implement new IT capabilities or a continuous drain on IT staff and resources. Managed private cloud computing models swing the door wide open to revolutionize the way your agency creates value.

The DNA of a managed private cloud is an agency’s dream:

  • No high up-front investment
  • Fast lead-time and deployment
  • On-demand IT resource scalability
  • Secure and reliable performance

What’s the right fit for you?

Some of the key attributes you should seek when pursuing a managed private cloud solution include:

  • Location agnostic – Ensure you have options to host your sensitive data in a managed private cloud at your facility, within a secure off-site data center, or a combination of both, depending on your agency’s requirements.
  • Safe and secure – Demand a security framework with automation and continuous monitoring to make the cloud functional. The solution must meet National Institute of Standards and Technology (NIST) 800-53 security controls to support FISMA, DIACAP and FedRAMP* regulations, as well as provide authority to operate (ATO) management capabilities.
  • Flexibility – In a multi-cloud world, make sure your provider offers the same or compatible technology in a dedicated/private cloud as well as a public/community cloud. This allows you to integrate legacy applications with low latency requirements while you migrate other cloud workloads to a FedRAMP public cloud.
  • Future-ready – Choose core cloud-enabled technology whose integrated, compatible components (hardware, servers, storage and networking) operate seamlessly and consistently across multiple cloud environments. It should have on-demand integration capability to easily connect any combination of cloud, software-as-a-service (SaaS) or on-premise applications.
  • Seamless management control – Request management tools that can easily manage multiple cloud environments, multiple ATOs and security requirements from a single view.

Dell Services Federal Government believes your journey to a managed private cloud should be smooth and seamless. That’s why we’ve removed the obstacles and created a simple approach to enable a quick transition to the cloud. Our NIST Dell-managed private cloud was designed from the ground up for the federal government, enabling you to confidently move workloads to a federally compliant and secure environment. Everything you need – on-demand capacity scaling, safe data sharing across cloud environments, built-in NIST 800-53 security controls, FISMA and DIACAP compliance, continuous monitoring, quick access to ATO certification and full control to manage multiple clouds – comes standard.

Choice . . . the heart of Dell’s cloud strategy

With our government customers, choice is a key factor in providing the right fit, as many agencies require flexible combinations of cloud delivery models, with both on-premise and hosted alternatives, to meet diverse agency objectives and specialized federal government security and compliance requirements. A NIST Dell-managed private cloud is just one way to accelerate your transition to harnessing the power of the cloud.

Let us partner with you on that journey. For more information, visit the Dell Federal Services website or contact Neil_Seiden@federal.dell.com.

* Federal Information Security Management Act (FISMA); Department of Defense Information Assurance Certification and Accreditation Process (DIACAP); Federal Risk and Authorization Management Program (FedRAMP)

See also: Cloud as an Enabler of IT Transformation and Innovation

Jason BocheVMware Horizon View Agent 6.1.0 Installation Rollback

With the release of vSphere 6 last week, I decided it was time to update some of the infrastructure in the home lab over the weekend. I got an early start Friday, as I had my three remaining wisdom teeth pulled in the AM and took the rest of the day off work. Now, I’m not talking about jumping straight to vSphere 6, not just yet. I’ve got some constraints that prevent me from going to vSphere 6 at the current time, but I expect I’ll be ready within a month or two. For the time being, the agenda involved migrating some guest operating systems from Windows Server 2008 R2 to Windows Server 2012 R2, migrating MS SQL Server 2008 R2 to MS SQL Server 2012, updating templates with current VMware Tools, and tackling VMware Horizon View – migrating Composer and the Connection Server from version 5.3 to 6.1.0, including the pool guests and related tools and agents.

I won’t bore anyone with the details of the OS and SQL migrations; those all went as planned. Rather, this writing focuses on an issue I encountered while upgrading VMware Horizon View Agents in Windows 7 guest virtual machines. For the most part, the upgrades went fine, as they always have in the past. However, I did run into one annoying Windows 7 guest VM that I could not upgrade from View agent 5.1 to View agent 6.1.0. About two-thirds of the way through the 6.1.0 agent upgrade/installation, when the installation wizard was installing services, a ‘Rolling back action‘ process would occur and the upgrade/installation would fail.

The View agent installation generates two fairly large log files located in C:\Users\\AppData\Local\Temp\.  I narrowed down the point in time the problem was occurring in the smaller of the two log files.
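
As an aside, if you ever need to dig these installer logs out yourself, a few lines of script will surface the freshest ones without guessing at file names. This is just a convenience sketch of my own (plain Python, nothing shipped with the View agent), and it assumes you run it inside the affected guest as the user who launched the installer:

# List the most recently modified .log files in the current user's Temp
# directory -- handy for spotting freshly written installer logs.
import os
import glob
import tempfile

temp_dir = tempfile.gettempdir()  # resolves to the user's AppData\Local\Temp
logs = sorted(glob.glob(os.path.join(temp_dir, "*.log")),
              key=os.path.getmtime, reverse=True)
for path in logs[:10]:
    print(os.path.getsize(path), path)

With the right log in hand, here is the excerpt in question: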

svm: 03/16/15 10:54:52 — CA exec: VMEditServiceDependencies
svm: 03/16/15 10:54:52 Getting Property CustomActionData = +;vmware-viewcomposer-ga;BFE;Tcpip;Netlogon
svm: 03/16/15 10:54:52 INFO: about to copy final string
svm: 03/16/15 10:54:52 INFO: *copyIter = RpcSs
svm: 03/16/15 10:54:52 INFO: newDependencyString = RpcSs
svm: 03/16/15 10:54:52 INFO: *copyIter = vmware-viewcomposer-ga
svm: 03/16/15 10:54:52 INFO: newDependencyString = RpcSs vmware-viewcomposer-ga
svm: 03/16/15 10:54:52 ERROR: ChangeServiceConfig failed with error: 5
svm: 03/16/15 10:54:52 End Logging
svm: 03/16/15 10:54:53 Begin Logging
svm: 03/16/15 10:54:53 — CA exec: VMEditServiceDependencies
svm: 03/16/15 10:54:53 Getting Property CustomActionData = -;vmware-viewcomposer-ga;BFE;Tcpip;Netlogon
svm: 03/16/15 10:54:53 Cannot query key value HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\DependOnService for size: 2
svm: 03/16/15 10:54:53 Cannot query key value HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Netlogon\DependOnService for size: 2
svm: 03/16/15 10:54:53 End Logging

In addition, the Windows event log reflected Event ID 7006: “The ScRegSetValueExW call failed for DependOnService with the following error: Access is denied.”

I had made a few different attempts to install the 6.1.0 agent, each time trying a different approach: I checked registry permissions and dependencies, relaxed registry permissions, enabled auditing, temporarily disabled Avast Antivirus, and so on. The VMware Horizon View Agent installs a handful of components. Although I didn’t yet know what the issue was on the OS, I had the problem narrowed down to the VMware Horizon View Composer Agent portion of the installation, which installs the VMware Horizon View Composer Guest Agent Server service (vmware-viewcomposer-ga is the name of the service if you’re looking in the registry).
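
If you want to eyeball the same registry values the installer edits, the sketch below prints the DependOnService list for each service named in the log. It’s a diagnostic helper of my own, not part of the View agent; run it with Python inside the affected guest, ideally from an elevated prompt:

# Print the DependOnService value for the services the installer touches.
import winreg

SERVICES = ["Tcpip", "Netlogon", "BFE", "vmware-viewcomposer-ga"]

for svc in SERVICES:
    key_path = r"SYSTEM\CurrentControlSet\Services" + "\\" + svc
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
            deps, _ = winreg.QueryValueEx(key, "DependOnService")
            print(svc, "->", deps)
    except FileNotFoundError:
        print(svc, "-> key or DependOnService value not found")
    except PermissionError:
        print(svc, "-> access denied (the same error 5 the installer hit)")

For reference, error 5 in the installer log is the Win32 code for access denied – the same symptom as the Event ID 7006 entry above.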

After doing some more digging, I found out that some antivirus applications, like Panda, have a self-preservation mechanism built in which can cause unexpected application problems. Avast has one as well; it’s called the avast! self-defense module. This defense mechanism works independently of the normal real-time antivirus scans, which I had disabled previously. I had never run into a problem with Avast in the past, but in this particular instance, Avast was blocking the modification of Windows services and dependencies. The easy solution – and I wish I had known this from the start, but I don’t invest much time in antivirus or malware unless I absolutely have to – was to disable the avast! self-defense module, which can be found in the Troubleshooting area of the Avast settings.

Once the avast! self-defense module was disabled, the installation of VMware Horizon View Agent 6.1.0, including the VMware Horizon View Composer Agent portion, completed successfully. After the installation completed, I rebooted and re-enabled the avast! self-defense module.
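
As a quick sanity check after the reboot, it’s worth confirming the Composer agent service registered with its dependencies intact. The built-in sc.exe tool shows this directly; the wrapper below is just my own two-line convenience, not anything View-specific:

# Show the registered configuration (including DEPENDENCIES) of the
# Composer guest agent service via the built-in sc.exe tool.
import subprocess

subprocess.run(["sc", "qc", "vmware-viewcomposer-ga"], check=True)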

Thus far I’m impressed with VMware Horizon 6.1. Not much has changed from a UI/management perspective, but stability and cleanup within Composer operations have improved. I built up and tore down a 28-guest Windows 7 VDI pool, and whereas this has led to precarious pool states and manual cleanup steps in the past, it has worked flawlessly so far. I’m definitely looking forward to the jump to vSphere 6 infrastructure in the coming weeks. All but one of the other lab infrastructure components have been upgraded and are ready at this point, so it shouldn’t be much longer until I have vSphere 5.x in my rear-view mirror.

Post from: boche.net - VMware Virtualization Evangelist

Copyright (c) 2010 Jason Boche. The contents of this post may not be reproduced or republished on another web page or web site without prior written permission.

Dell TechCenterMolecular Dynamics Benchmarking with Intel Xeon Phi 7120P Coprocessors

This blog evaluates NAMD performance with Intel Xeon Phi 7120P coprocessors on the PowerEdge C4130 server.

Dell TechCenterWhy do you need a systems management solution and which one is the best for you? – New White Paper from EMA has the answers

IT environments are becoming increasingly diverse and complex, and consequently harder to manage. Mobility, along with more and more smart devices (i.e., the Internet of Things), has led to a significant increase in the number and types of devices connected to corporate networks – devices that you must now track, manage and secure. If your organization previously felt it could get along without a comprehensive systems management strategy, you are likely now feeling the pressure to find a comprehensive “anypoint” systems management solution, one that is both easy to use and addresses all of your concerns.

Whether you’re in the market for your first systems management lifecycle solution or you’re looking to upgrade your existing one, you’ll find a valuable resource in “Best Practices in Lifecycle Management,” the new comparative paper from Enterprise Management Associates, Inc. (EMA). I’ll cover highlights of the paper in this series of blog posts, starting with this post on the most important criteria for evaluating systems management products.

Covering the main disciplines in systems lifecycle management

Systems lifecycle management spans all the processes you need to get the most productivity and efficiency out of your IT spend. These processes have evolved and expanded over time, and you need to address all of them if you want to sleep soundly at night without worrying about compliance violations, security vulnerabilities, the ever-increasing number of mobile and non-computer devices on your network, or lost productivity due to system performance issues:

  • Asset Management – Discovering hardware and software assets and managing the asset lifecycle
  • Software Distribution and Provisioning – Installing software from a central location
  • Security and Patch Management – Automating patching and security management
  • Configuration Compliance and Remediation – Centralizing systems configuration and enforcement of policies
  • Process Automation – Automating and integrating IT management processes
  • Service Desk – Providing a service desk, integrated with asset management and reporting
  • Reporting – Comprehensive and customizable reporting and alerts

In light of those and other criteria, EMA conducted extensive research and evaluated four of the most popular lifecycle management solutions:

  • Dell KACE K1000 Systems Management and K2000 Systems Deployment Appliances
  • LANDESK Management Suite 9.6 SP1
  • Microsoft System Center 2012 R2 Configuration Manager (SCCM)
  • Symantec Altiris Client Management Suite (CMS) 7.5 SP1

You’ll take away a clear picture of how to shop for lifecycle management products. You’ll see how to start narrowing down the checklist of features and capabilities that will make the biggest difference in managing the desktops, laptops, servers and non-computer devices in your organization.

Download the white paper

Get your free copy of “Best Practices in Lifecycle Management” for EMA’s detailed comparison of over 45 features across seven main areas of lifecycle management.

Dell TechCenter2015 Priority – Better Backup and Recovery For the DBA


Last week we reviewed the first critical step in SQL Server database management – assessing your current environment to develop a map guiding a management overhaul. Having the most comprehensive view of the database helps inform design and decisions on backup and recovery policies – the second essential step in optimized management.

SQL Server features high-availability capabilities including:

  • AlwaysOn Failover Cluster Instances – provide server-instance-level redundancy
  • AlwaysOn Availability Groups – an enterprise-level solution that maximizes availability for one or more user databases
  • Database Mirroring – increases availability of a SQL Server database by supporting almost instantaneous failover
  • Log Shipping – functioning at the database level, provides automated backup and restore

Want to get your SQL Server environment to work for you – instead of the other way around? Click here to download “The Essential DBA Playbook for Optimized SQL Server Management.”

Since high-availability capabilities exist, some DBAs get tripped up thinking disaster recovery is covered as a result. This is a common misconception.

In actuality, high availability and disaster recovery are not synonymous.

High availability refers to the processes put in place to ensure applications and services running on a database remain up, while disaster recovery refers to retaining the data within SQL Server in the event of an outage.

The need for disaster recovery prompts DBAs to review their backup and recovery strategies. Backup and recovery go hand in hand: there is no way to simply implement a recovery strategy, which typically takes significant effort and a multi-phase approach, without first assessing backup processes. Download “The Essential DBA Playbook for Optimized SQL Server Management” to learn more.
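
To see how little ceremony the backup half of the equation requires, here is a minimal sketch of a full database backup issued from Python via pyodbc. The server, database name and target path are placeholders of ours rather than anything from the playbook, and a real strategy would schedule this and layer differential and transaction log backups on top:

# Issue a full database backup. BACKUP DATABASE cannot run inside a
# transaction, so the connection must be in autocommit mode.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes;",
    autocommit=True,
)
conn.execute(
    "BACKUP DATABASE [YourDb] "  # hypothetical database name
    "TO DISK = N'D:\\Backups\\YourDb_full.bak' "  # hypothetical target path
    "WITH CHECKSUM, INIT"
)
conn.close()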

Dell TechCenterSimplify administration and management with new vSphere Virtual Volumes

Finally, vSphere Virtual Volumes is here! It has been a long road since VMware first spoke about Virtual Volumes all those VMworlds ago, but trust me, it has been well worth the wait. Virtual Volumes is probably the biggest change to the ESXi storage stack since VMware’s inception, and the biggest change to SANs since storage virtualization freed LUNs from specific disks. We at Dell Storage have been working closely with VMware to support this integration and management framework. Customers will get their first opportunity to use Virtual Volumes integration when we release our Dell EqualLogic PS Series Array Software 8.0 this spring.

What exactly is Virtual Volumes? In an earlier blog post, I described the granularity of Virtual Volumes and the shift from volume-level data services to data services applied at the virtual machine level. Let’s build upon that and understand how Virtual Volumes with the PS Series can change your day-to-day tasks by allowing you to:

  • Create granular Storage Profiles that meet your needs from the individual storage capabilities advertised by the array
  • Deploy new virtual machines in under five seconds, no matter how big
  • Take hardware based snapshots of individual virtual machines, keep them as long as needed, and, most importantly, restore them almost instantly
  • Take advantage of other new features releasing post-launch like VMDK I/O metrics with EqualLogic SAN HQ

The benefits of faster virtual machine deployment and long-lived snapshots are straightforward, but the change in Storage Profiles is less obvious. With VASA 1.0, a Storage Profile was built upon a concatenated string of capabilities advertised by the array. With VASA 2.0, which Virtual Volumes uses, you get to choose the individual capabilities that are important to you for your storage profiles. For example, you could create a Bulk Storage Profile built from the following capabilities advertised by the PS Series array: RAID Type “RAID6 or RAID60 or RAID50” and Disk Speed “5,400rpm or 7,200rpm.” As long as the underlying PS Series storage consisted of one of those RAID types and one of those disk speeds, the policy would be compliant.

Let’s look at a second example. We’ve created a Secure Storage Profile with just the capability Disk Encryption set to “Yes.” In this case, it does not matter what RAID Type or Disk Speed is in use; as long as Disk Encryption is enabled, the policy would be compliant. In summary, pick only the capabilities that you need from the image below, and ignore the rest.

Storage Profiles with Dell EqualLogic Virtual Storage Manager 4.5
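
If it helps to see the matching rules spelled out, here is a toy sketch of the compliance logic from the two examples above. It is purely illustrative – plain Python, not the VASA API or anything in Virtual Storage Manager – and all the names in it are our own: a profile constrains only the capabilities it lists, and storage is compliant when every listed capability matches.

# Toy model of capability-based profile compliance: a profile constrains
# only the capabilities it lists; everything else is ignored.
def compliant(profile, storage):
    for cap, allowed in profile.items():
        if storage.get(cap) not in allowed:
            return False
    return True

bulk_profile = {
    "raid_type": {"RAID6", "RAID60", "RAID50"},
    "disk_speed_rpm": {5400, 7200},
}
secure_profile = {"disk_encryption": {"Yes"}}

array = {"raid_type": "RAID6", "disk_speed_rpm": 7200, "disk_encryption": "No"}

print(compliant(bulk_profile, array))    # True: RAID type and disk speed match
print(compliant(secure_profile, array))  # False: encryption is not enabled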

We know not everyone has a spare PS Series array, or the time to stand up a vSphere environment in order to understand a new feature, no matter how world-changing it is. So we went ahead and did it for you. The lab we built for VMworld 2014 continues to be available from our friends over at VMware’s Hands-on Labs. So come play in our sandbox and get hands-on with Virtual Volumes on Dell EqualLogic PS Series arrays. Please note this lab was built from pre-release software, so you should expect several improvements in our final version, including a few GUI changes. Also, the VMware HOL is a collection of virtual machines, so you should anticipate even better performance in a real-life scenario.

Early customer feedback has been quite positive. One of our customers who used the HOL, Maurizio Davini, Technical Coordinator of the IT Center at the University of Pisa, shared with us that “Operating in a Virtual Volumes VM-aware environment greatly simplifies administration and management compared to the traditional LUN-centric environment.” These are the key benefits of Virtual Volumes, and we believe other users will agree.

For more information on Dell Storage PS Series integration with Virtual Volumes, follow me at @Dell_Storage. I will be sharing best practices and videos in the coming weeks. Stay tuned!
