Samuel Greengard, Datamation

Best Data Science Software and Tools
June 19, 2021

Why Should You Use Data Science Tools?

Data science has transformed our world. The ability to extract insights from enormous sets of structured and unstructured data has revolutionized numerous fields, from marketing and medicine to agriculture and astronomy. Drawing on mathematics, statistics, computer science, information science and other disciplines, data science applies formulas and algorithms to transform mountains of raw data into useful information. 

Today, businesses, governments, academic researchers and many others rely on data science to tackle complex tasks that push beyond the limits of human capability. Within the enterprise, it’s increasingly paired with machine learning (ML) and other artificial intelligence (AI) tools to sharpen insights and drive efficiency gains. For example, it can power predictive analytics, make Internet of Things (IoT) data actionable, support the development and modeling of new products, spot problems or anomalies during manufacturing, and illuminate supply chains in deeper and broader ways.

In the past, data science required specialized expertise. Today’s data science platforms, however, are increasingly designed for business analysts and other citizen data scientists. These platforms still approach tasks in remarkably different ways, using different methods to aggregate and process data and to generate actionable reports, graphics or simulations. 

How Do Data Science Tools Differ?

Some software applications focus on building elaborate models and require advanced coding skills; these platforms may also demand specialized hardware or other systems. Others use R or Python to execute model code but don’t support additional programming languages that would expand the platform’s flexibility. Still others offer only drag-and-drop functionality: models can be built by manipulating objects on screen, but nothing beyond that. 
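For the code-centric platforms, “executing model code” means running ordinary Python or R scripts. As a rough illustration, not tied to any specific product, here is a minimal Python sketch that fits a least-squares line to made-up data points, the kind of routine a platform would run on real enterprise data:

```python
# Minimal sketch of model code a Python-based platform might execute:
# a least-squares linear fit computed from its closed-form formulas.
# The data points below are invented for illustration.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 8.1, 9.9]  # roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# slope = cov(x, y) / var(x); intercept follows from the means.
slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
slope_den = sum((x - mean_x) ** 2 for x in xs)
slope = slope_num / slope_den
intercept = mean_y - slope * mean_x

print(f"y = {slope:.2f}x + {intercept:.2f}")  # prints "y = 1.97x + 0.15"
```

In practice a platform wraps steps like these in reusable, scheduled workflows; the point of the sketch is only what “model code” looks like at its simplest.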

As a result, it’s important to thoroughly understand your organization’s needs, which data science methods and approaches best suit your requirements, and which vendors fit your industry and business model. This includes determining whether the software will be used by business analysts, data scientists or both, and weighing what each vendor offers in terms of pricing, product road map, and service and support. 

How to Select a Data Science Software Platform

Here are some key questions to focus on if you’re in the market for data science software:

  • What do you need the data science software to do? Not surprisingly, some platforms are better suited to certain types of tasks or certain industries and fields. They may capture or ingest data from specific sources, process and maintain data in a specific way and include features that enable certain types of analytics or modeling or support technology frameworks such as IoT.
  • Who will be using the software and what level of technical expertise do they have? While most of today’s platforms can be used by business analysts with some level of data science knowledge, data science solutions differ greatly. Some require deeper knowledge of statistical modeling methods, machine learning and programming languages. Others are rooted more in traditional business intelligence (BI) and may require some knowledge of Oracle Data Miner or SQL. 
  • How well is the solution designed and how good a fit is it? As with any type of software, it’s critical to focus on the user interface (UI), the features and the ability to adapt models for different purposes. Make sure a product has the flexibility your organization needs and that it matches your level of expertise and overall objectives. 
  • What types of data reporting and visualization tools does the solution provide? Some platforms focus on data reporting and business intelligence, while others revolve around elaborate visualizations. These tools might also touch different areas of data science, including qualitative analytics, predictive analytics, regression analysis or text mining. 
  • What does the solution cost? Pricing varies significantly among vendors. It isn’t unusual for a solution to cost $2,000 or more per month per user, and a few charge more than $50,000 a year per seat. However, many vendors are moving to a more flexible and OPEX-friendly SaaS tiered pricing model. There also are low-priced options for SMBs, such as Microsoft Excel or open source applications that aren’t included in this roundup.
  • What is the vendor’s road map and what is its commitment to support? The field of data science is evolving rapidly, and once highly technical domains such as machine learning and deep learning are now appearing in mainstream solutions. It’s critical to know where the vendor is headed and how committed it is to supporting the platform. Also, understand the service level agreement (SLA) before signing on the dotted line.

See more: Structured vs. Unstructured Data

Top 10 Data Science Software Solutions

Alteryx

The widely used platform combines powerful analytics, data science and process automation within a single low-code/no-code environment. It incorporates machine learning and other AI methods to deliver geospatial analytics, prescriptive analytics and numerous other outcomes via visual dashboards, files and apps. 

Pros

  • Offers powerful but easy-to-use features for business leaders.
  • Integrates with 80+ data sources and outputs to numerous tools from Microsoft, AWS, Snowflake, Tableau and Salesforce.
  • Provides more than 300 no-code building blocks that facilitate data models and automation.
  • Highly rated customer support.
  • Large and robust user community.

Cons

  • Low-code environment means it may not be customizable enough for complex data science projects.
  • Expensive.
  • Some users complain about complexity of workflows.
  • The platform doesn’t fully support mobile use, including Android and iOS. 
  • The desktop version places heavy demands on systems.

Dataiku DSS

The solution offers a platform for data science and machine learning. It’s especially suited to multidisciplinary teams composed of both data scientists and business users. Dataiku is available in cloud/SaaS, Windows and Mac desktop versions. It incorporates strong data visualizations, deep learning, machine learning, algorithm libraries, natural language processing and predictive modeling/analytics capabilities.

Pros

  • Powerful no-code tools are ideal for non-data scientists.
  • Ranked as a “leader” in Gartner’s 2021 Magic Quadrant for Data Science and Machine Learning Platforms.
  • High user ratings for the interface as well as collaboration features.
  • Broad and innovative support for business metrics that extend beyond model accuracy.

Cons

  • Heavy reliance on extensions and plugins can add overhead and complexity.
  • Pricing for versions without full enterprise capabilities is high, and their features are limited.
  • Limited support for mobile devices.
  • Some users complain that it’s difficult to configure.

H2O.ai

The vendor offers an end-to-end data science platform that’s designed to democratize artificial intelligence. H2O AI Hybrid Cloud supports “explainable” models that work across a wide array of industries and use cases. The open-source predictive analytics platform is designed for both data scientists and citizen data scientists. 

Pros

  • Intuitive interface. 
  • Powerful predictive analytics capabilities and strong data visualization features.
  • Strong automation. Includes more than 200 data connectors and 180 open-source Python scripts.
  • Open platform deployed through Kubernetes makes it possible to use models everywhere, including virtual machines, Snowflake and IoT devices.
  • Ranked as a “visionary” by Gartner in its 2021 Magic Quadrant for Data Science and Machine Learning Platforms.

Cons

  • Data access and data preparation features aren’t as robust as those of some competitors.
  • Some users complain about the lack of documentation and support resources.
  • Difficult to build models from scratch.
  • Can be challenging to tweak machine learning algorithms.

IBM Watson Studio

IBM’s focus is on building, managing and deploying data models through an AI-centric approach. The cloud-based platform is designed for data scientists, developers and analysts. It is built on open source technologies such as PyTorch, TensorFlow and scikit-learn—with connections to numerous code-based and visual data science tools from IBM.

Pros

  • Suitable for use by a wide range of users, from data scientists to business analysts.
  • Flexible modular design.
  • Strong data exploration and visualization features.
  • Focus on responsible AI.
  • Ranked as a “Leader” in Gartner’s 2021 Magic Quadrant for Data Science and Machine Learning Platforms.

Cons

  • Some users complain the program is slow to load at times.
  • User interface and navigation can be confusing, especially for non-technical users.
  • Expensive.
  • Complaints about inadequate documentation and support materials.

KNIME Analytics Platform

Big data and predictive analytics are at the center of the vendor’s data science platform. The cloud-based solution is designed for authoring data science and machine learning workflows and projects. The open source platform includes more than 4,000 nodes for connecting to various types of data sources and transforming data into actionable models.

Pros

  • Supports an extensive array of DSML tasks and builds strong workflows. 
  • Intuitive interface.
  • Powerful data connection and ingestion capabilities, including support for most major file types and data sources.
  • Ranked as a “visionary” in Gartner’s 2021 Magic Quadrant for Data Science and Machine Learning Platforms.

Cons

  • Data visualization features aren’t as robust or developed as those of many competitors.
  • Users report a sometimes steep learning curve.
  • Limited customer support for enterprise deployments.
  • Some users complain about a lack of flexibility.

MathWorks MATLAB

This data science platform, from MathWorks, is designed to develop, integrate and deploy advanced AI and ML models at scale. It serves as a programming environment for algorithm development and data analysis, and it includes powerful data visualization, modeling and simulation capabilities—as well as tools for building apps and other resources. 

Pros

  • Powerful deep learning, machine learning and predictive maintenance capabilities—including in areas such as robotics and signal processing.
  • Highly flexible framework that supports distributed environments ranging from the data center to the cloud and edge.
  • Verifiable and reliable machine learning, which is used by organizations that need ultra-safe and secure deployments.
  • Ranked as a “leader” in Gartner’s 2021 Magic Quadrant for Data Science and Machine Learning Platforms.

Cons

  • Too complex for most citizen data scientists. Best for engineers and dedicated data scientists. 
  • No cloud or SaaS version. Available only as a desktop version for Windows, Mac and Linux.
  • No free trial and no premium consulting or integration services available from the vendor.
  • Can perform slowly with large datasets. 

Microsoft Azure Machine Learning Studio

The end-to-end data science and analytics platform offers a low-code and no-code framework for developing, training and deploying data models. It accommodates classical models as well as machine learning and deep learning. It integrates with numerous other Azure cloud components and services, as well as outside data sources. 

Pros

  • Delivers a broad and powerful portfolio of features, tools and components for data science.
  • Suitable for use by data scientists and business users. 
  • Provides flexible notebook and SDK options for expert data scientists.
  • Offers an open framework with a strong network of partners, including other analytics providers that connect to Azure.
  • Ranked as a “visionary” in Gartner’s 2021 Magic Quadrant for Data Science and Machine Learning Platforms.

Cons

  • Requires a strong understanding of Azure and its associated ecosystem of modules and services.
  • Can be difficult to use for organizations requiring hybrid and multi-cloud data science environments.
  • Users rate ease of use lower than other data science solutions.
  • Limited support for third-party tools and programming languages.
  • Large data sets sometimes run slowly.

RapidMiner Studio

The vendor’s platform offers broad and rich tools for both data scientists and business users within a visual workflow design framework. It includes more than 1,500 native algorithms, data prep and data science functions, with support for third-party libraries. RapidMiner Studio also includes strong support for notebooks and programming languages such as Python and R.

Pros

  • Connects with virtually any data source through a point-and-click interface.
  • Accommodates automated in-database processing for retrieving data without the need to write complex SQL.
  • Strong data visualization and exploration capabilities.
  • Collaboration features extend across multiple roles and personas.
  • Strong security features, including single sign-on.
  • Rated as a “leader” in The Forrester Wave: Multimodal Predictive Analytics & Machine Learning Solutions for 2020.

Cons

  • Receives relatively low marks among users for model publishing flexibility.
  • Some users complain about a difficult-to-use, inflexible interface.
  • A free edition provides limited features and capabilities. Other versions are pricey.
  • Some complaints from users about outdated-looking visual output, including charts, graphs, animations and video.
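The “automated in-database processing” pro listed for RapidMiner Studio is worth unpacking: rather than exporting raw rows, such tools push aggregation down into the database engine so only summary rows cross the wire. A rough stdlib sketch of the idea, with sqlite3 and a tiny invented orders table standing in for a real enterprise database:

```python
# Sketch of in-database processing: the aggregation runs inside the
# database engine, so only summary rows are returned. sqlite3 and the
# orders table below are illustrative stand-ins, not RapidMiner's API.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("North", 120.0), ("North", 80.0), ("South", 50.0)])

# This is the kind of SQL a visual tool generates behind the scenes
# when a user drags "region" and "sum of amount" onto a canvas.
rows = con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 200.0), ('South', 50.0)]
con.close()
```

The appeal for business users is that the grouping and summing above happen without anyone writing the SQL by hand.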

SAS Visual Analytics

The vendor, a longstanding leader in data science, offers an enterprise platform focused heavily on analytics visualizations, composite AI, MLOps and decision intelligence. It supports virtually all major data sources and types, has customizable dashboards with templates, and includes robust publishing features with numerous pre-built visualization formats. 

Pros

  • Especially strong in predictive analytics, pattern recognition and machine learning.
  • SAS has established a partnership with Microsoft to support tight integration with Azure and Azure Machine Learning Studio.
  • Dedicated iOS and Android apps and responsive design for mobile web access. 
  • Excellent scalability with support for large numbers of users.
  • Ranked a “leader” in Gartner’s 2021 Magic Quadrant for Data Science and Machine Learning Platforms.

Cons

  • Installation and configuration can be difficult.
  • Lags behind other solutions for ease of use.
  • Limited open source support.
  • Some users complain that the user interface is somewhat drab and dated, and the platform can be difficult to learn.
  • Expensive.

Tibco Spotfire

The data visualization platform generates insights through natural language query (NLQ)-powered search, AI-driven recommendations and direct manipulation. It includes immersive dashboards and advanced analytics support for predictive, geolocation and streaming analytics. The cloud-based platform is designed for both dedicated data scientists and other users.

Pros

  • Includes more than 60 native connectors to major data sources, along with custom connections via rich APIs.
  • Offers AI-driven recommendations and natural language search that simplify things for non-technical users.
  • Enables powerful collaboration among multiple personas and user groups.
  • Dedicated iOS and Android apps, along with responsive design for mobile browsers.

Cons

  • Citizen data scientist features and support lag behind those of other vendors.
  • Some users complain that the platform needs a more user-friendly interface.
  • Limited customization and scripting features can make more advanced modeling and data visualization difficult.
  • Some users complain that data loading and system performance can be slow.

See more: Top Data Visualization Tools for 2021

Top Data Science Software Comparison Chart

Alteryx Designer

Pros:
  • Powerful features and easy to use
  • Integrates well with data sources and software
  • Strong no-code features
  • High customer ratings

Cons:
  • Not highly customizable
  • Limited support for mobile devices
  • Workflows can be complex
  • Desktop version puts a heavy demand on hardware
Dataiku DSS

Pros:
  • No-code tools are ideal for non-data scientists
  • Supports diverse business cases and metrics
  • Ranked as a “leader” by Gartner

Cons:
  • Heavy reliance on extensions and plugins
  • Limited support for mobile devices
  • Limited configurability
  • Can be pricey
H2O.ai

Pros:
  • Intuitive interface
  • Powerful predictive analytics and visualization features
  • Excellent automation
  • Open platform
  • Ranked as a “visionary” by Gartner

Cons:
  • Lacks some data access and prep features
  • Documentation is sometimes lacking
  • Difficult to build models from scratch
  • Difficult to tweak machine learning algorithms
IBM Watson Studio

Pros:
  • Powerful features
  • Suitable for use by non-data scientists
  • Strong data exploration and visualization
  • Focus on responsible AI
  • Ranked as a “leader” by Gartner

Cons:
  • Can perform slowly with large data sets
  • User interface can be daunting
  • Expensive
  • Some user complaints about the lack of documentation and support materials
KNIME Analytics Platform

Pros:
  • Supports numerous tasks and workflows
  • Intuitive interface
  • Powerful data connection and ingestion
  • Ranked as a “visionary” by Gartner

Cons:
  • Lags behind others in data visualization
  • Steep learning curve
  • Limited customer support
  • Users say the solution lacks flexibility
MathWorks MATLAB

Pros:
  • Strong deep learning, ML and predictive maintenance
  • Very flexible
  • Ideal for situations where reliable and accurate results are critical
  • Ranked as a “leader” by Gartner

Cons:
  • Too complex for citizen data scientists
  • No cloud or SaaS version
  • Can perform slowly with large datasets
  • Limited vendor support
Microsoft Azure Machine Learning Studio

Pros:
  • Feature rich
  • Suitable for both data scientists and others
  • Highly flexible notebooks and SDK
  • Open framework
  • Ranked as a “visionary” by Gartner

Cons:
  • Requires a strong understanding of the Azure ecosystem
  • Not well suited to hybrid and multi-cloud environments
  • Can be difficult to use
  • Limited third-party connections
  • Large data sets sometimes run slowly
RapidMiner Studio

Pros:
  • Excellent connectivity with data sources
  • Strong visualization and exploration
  • Powerful collaboration features
  • Rated as a “leader” by Forrester

Cons:
  • Model publishing lags behind competitors
  • Interface can be confusing and inflexible
  • Expensive
  • User complaints about outdated visuals
SAS Visual Analytics

Pros:
  • Powerful predictive analytics, pattern recognition and ML
  • Close partnership with Microsoft and Azure
  • Excellent mobile support
  • Highly scalable
  • Ranked as a “leader” by Gartner

Cons:
  • Installation and configuration can be difficult
  • Can be difficult to use
  • Limited open source support
  • Expensive
  • Users complain that the interface is drab and dated
Tibco Spotfire

Pros:
  • Excellent connectivity with data sources
  • Provides natural language support and AI-driven recommendations and guidance
  • Strong collaboration
  • Excellent mobile support

Cons:
  • Better suited for data scientists
  • Some users complain that the interface isn’t user friendly
  • Limited scripting and customization
  • Data loading and performance can be slow

 

Best Data Visualization Tools & Software
June 5, 2021

The amount of data generated and consumed by organizations is growing at an astounding rate. The total volume of data and information worldwide has risen from approximately 2 zettabytes in 2010 to 74 zettabytes in 2021, according to online data service Statista. By 2024 the figure is expected to hit 149 zettabytes.

As organizations find themselves awash in data, there’s a growing need to make it digestible, understandable and actionable. Data visualization software takes direct aim at the task by creating static or moving images that communicate concepts in a way that words and numbers alone cannot. 

What Is a Data Visualization Tool?

Data visualization tools find and display key data insights. The practice involves pulling data from a database and creating dashboards and graphics such as pie charts, bar charts, scatter plots, polar area diagrams, heat maps, timelines, ring charts, matrix charts and word clouds. By representing myriad data points graphically, it’s possible to peer deeper into important numbers, trends, metrics and key performance indicators (KPIs). 
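As a toy illustration of the aggregate-then-render step these tools automate, here is a Python sketch using invented sales figures and a plain-text bar chart standing in for the polished graphics a real product would draw:

```python
# Toy sketch of what a visualization tool automates: take aggregated
# records and render the totals graphically (here, a text bar chart).
# The regional sales figures below are made-up example data.
sales = {"North": 42, "South": 17, "East": 30, "West": 25}

scale = 50 / max(sales.values())  # widest bar = 50 characters
for region, total in sorted(sales.items(), key=lambda kv: -kv[1]):
    bar = "#" * round(total * scale)
    print(f"{region:<6} {bar} {total}")
```

A commercial tool performs the same scaling and ordering decisions internally, then layers on color, interactivity and drill-down.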

Not surprisingly, data visualization tools have moved out from the shadow of data scientists and IT departments and into the hands of business users. Organizations are now using visualization software to better understand scenarios as varied as customer sentiment and behavior, real-time sales, health care trends, departmental goals and market research, to name a few. In addition, advertisers and media organizations use these tools to generate eye-catching graphics and infographics.

Of course, different data visualization tools approach the task differently. Some lean toward more conventional business intelligence (BI) functions, while others plug in live data from social media and various applications residing in the enterprise. Some of these tools also incorporate machine learning and AI to deliver more advanced functionality and insights. Most packages include templates and connectors for building robust models, graphics and dashboards. 

If you’re in the market for the best data visualization software, take the time to understand what various vendors and applications offer, how they work and whether they’re able to accommodate your organization’s data visualization needs — and pricing preferences. 

How to Select the Best Data Visualization Software

It’s important to focus on several factors when selecting a data visualization tool.

  • What types of visualizations do you require? Not surprisingly, different data visualization tools provide different ways to aggregate and view data. Make sure you can easily connect to and input the data you require. Most of these packages come with a robust set of APIs for ingesting data.
  • What type of platform does the software run on and what devices does it support? Some solutions are cloud-based, while others reside on desktops and even mobile devices. Some vendors that support an on-premises model have applications that run only on Windows, which can present problems if you have teams using Macs. 
  • Does the package adequately support your organization’s performance requirements? Some applications encounter difficulties with extremely large files, and some don’t perform well in certain situations. If the rendering engine can’t support the speed required for web pages and real-time dashboards, you may have a problem. 
  • Does the application integrate with your workflows? Flexibility and scalability are often crucial. You may need to change templates, inputs or criteria from time to time — including through other programs and platforms connected through APIs. Make sure the program can support these changes.
  • What does vendor support look like? An application may produce stunning visualizations, yet building them can be extraordinarily difficult. Make sure a vendor offers solid documentation and support, including videos and tutorials. Also, check whether the vendor offers 24×7 phone support in case you get bogged down. 
  • What does the package cost? Some solutions, such as Google Data Studio, are free. Of course, they may not deliver the features you need — or they may lock you into a specific cloud provider. Most vendors offer tiered pricing, including an enterprise option. Review the choices carefully.
  • What security protections does it offer? Make sure that a platform provides adequate protections for accessing, securing and sharing data. 
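One of the questions above concerns APIs for ingesting data. To make that concrete, here is a hedged Python sketch of pushing a single metric to a visualization platform's ingestion endpoint. The URL, token and payload schema are entirely hypothetical; every real product defines its own, so consult the vendor's API reference:

```python
# Hypothetical sketch of pushing a metric to a visualization tool's
# ingestion API. The endpoint, token and payload fields are invented
# for illustration and do not belong to any real product.
import json
from urllib import request

API_URL = "https://dashboards.example.com/api/v1/metrics"  # hypothetical
payload = {
    "metric": "daily_active_users",
    "value": 1342,
    "timestamp": "2021-06-01T00:00:00Z",
}

req = request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer YOUR_API_TOKEN"},  # placeholder token
    method="POST",
)
# request.urlopen(req) would send it; the call is left out so this
# sketch runs without a live endpoint.
print(json.dumps(payload))
```

The general shape, JSON payload, bearer-token header, POST to a metrics endpoint, is common across such products even though the details differ.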

10 Top Data Visualization Tools & Software


See more: What is Data Visualization?

Databox

The cloud-based business analytics platform pulls data from a wide variety of sources to generate data visualizations in real time. The list includes Google Analytics, Salesforce, HubSpot, Facebook, Mixpanel and Shopify. Databox offers more than 200 built-in dashboard templates, a robust set of APIs, metrics calculators and mobile apps for viewing data visualizations. The vendor offers a tiered pricing model.

Pros

  • Offers innovative features, including looped data boards, scheduled snapshots and annotations
  • Provides more than 70 one-click integrations with data services
  • Offers more than 200 pre-built reports
  • Intuitive interface and highly flexible visualizations

Cons

  • Some users complain about subpar integrations leading to inaccurate data and visualizations
  • Reports aren’t highly customizable
  • Some users complain about frequent bugs and crashes

Google Data Studio

Google Data Studio incorporates interactive dashboards and automated reporting. The cloud-based platform imports data from multiple sources, including Google Analytics, Google Ads and spreadsheets, and it integrates with more than 150 other cloud, SQL, ecommerce, and digital advertising platforms. Google Data Studio supports a wide array of data visualizations, including time series, bar charts, pie charts, tables, heat maps, geo maps, scorecards, scatter charts, bullet charts, and area charts.

Pros

  • Free
  • Drag-and-drop interface doesn’t require coding skills or heavy technical knowledge
  • Offers strong collaboration features and the ability to share dashboards
  • Built in tool for calculating metrics and formulas
  • Highly customizable

Cons

  • Can be difficult to integrate with non-Google platforms
  • Some users complain that the reporting functions are confusing and difficult to use
  • Some complaints about frequent bugs and crashes
  • Users complain that customer support is subpar

iDashboards

The application strives for real-time operational intelligence through rich visualization capabilities. It combines data from upwards of 160 sources, offers hundreds of chart and design options, and builds dashboards that work on nearly any device. It also can use real-time data feeds to embed graphics and dashboard visualizations. This makes it possible to build dashboards for different organizational roles while supporting websites and mobile apps.

Pros

  • Straightforward and easy-to-use drag-and-drop interface
  • Pulls data from almost any source. Comes with nearly 300 connectors, including all major cloud and application platforms
  • Generates extremely rich data visualizations
  • Highly flexible and customizable
  • Pricing is attractive, particularly for SMBs

Cons

  • Can be difficult to set up and configure
  • The large number of design options can be daunting to new users
  • Some users have problems connecting to or importing very large source files
  • Some premium features require additional licensing and costs

Infogram

Infogram is a cloud-based marketing and media tool that supports more than 35 types of interactive data visualization formats. These include infographics, reports, dashboards, maps, charts and social media assets, such as Facebook, LinkedIn and Pinterest. It provides a drag-and-drop interface, real-time collaboration and the ability to publish online. There’s a basic free version as well as four other tiers for creatives, SMBs and large enterprises.

Pros

  • Offers a large and varied collection of designer templates, including interactive charts, maps and animations
  • Intuitive, easy-to-use interface
  • Integrates well with Google Drive, OneDrive and Dropbox
  • Powerful and elegant collaboration features for teams

Cons

  • The free plan doesn’t allow customizations and file downloads to systems and devices
  • More advanced features and plans can be pricey
  • Some users report bugs and crashes
  • No ability to work on projects offline

Qlik Sense

The vendor offers a self-service data analytics platform that’s designed for a broad array of users, including executives, decision-makers and analysts. The on-premises or cloud software provides drag-and-drop functionality and it connects to numerous data sources, including Snowflake and other leading products. Qlik Sense generates a varied array of data visualizations through interactive dashboards, and the application includes an open API and toolsets.

Pros

  • Delivers powerful features and tools for building complex data visualizations from nearly any data source or set
  • Offers an AI-based Smart Search feature that helps users uncover data relationships
  • Uses machine learning and AI for enhanced insights
  • Includes real-time analytics and data visualization
  • Excellent mobile device functionality

Cons

  • Learning curve can be steep
  • Requires some technical knowledge to use the software effectively
  • Users report that customizations can be challenging
  • Can be expensive, especially with add-ons

Sisense

The AI-powered analytics platform uses a robust set of APIs to generate data visualizations and actionable analytics. Available both in the cloud and on-premises, Sisense is highly customizable, and it includes data connectors for most major services, including Snowflake, Salesforce, Adobe Analytics, Amazon S3, Dropbox, Facebook and numerous Microsoft applications. It’s suitable for use by non-data scientists and line-of-business users.

Pros

  • Delivers powerful features along with fast and rich visualizations
  • Intuitive user interface
  • Customizable and flexible
  • Uses natural language and other AI to generate reports and visualizations
  • Highly rated customer support

Cons

  • Some reports of performance slowing with heavy data loads
  • May require knowledge of coding, including JavaScript and CSS, to format visualizations
  • Some users complain that documentation is lacking, particularly surrounding widgets
  • Documentation can be difficult to understand

Tableau

The popular business intelligence platform works with a broad array of data sources and services, from spreadsheets and conventional databases to Hadoop and cloud data repositories. It features smart dashboards and a highly interactive interface that lets users drag and drop elements, manipulate and combine data and views, and display data in numerous formats. Tableau includes robust sharing features. 

Pros

  • Fast and powerful
  • Well-designed interface
  • Consistently ranked as a leader by Gartner and others
  • Supports all major platforms and works on almost any device
  • Connects to hundreds of data sources and supports all major data formats.

Cons

  • Expensive
  • Mixed reviews about customer support
  • May require training to use the full set of features and capabilities on the platform
  • Difficult to customize
  • Lacks some important security controls.

Visme

Visme is focused on creating visual brand experiences and other content, including flyers, emails, reports, e-books, embedded videos, animations, and social media graphics. It incorporates a drag-and-drop interface and pulls data from numerous sources to generate illustrations, infographics, presentations and more. Visme offers a basic free service and tiered plans. 

Pros

  • Offers thousands of templates for infographics, presentations, charts, maps, documents and more
  • Integrates with Slack, YouTube, Vimeo, Dropbox, Google Drive, SurveyMonkey, Mailchimp, Google Maps and many other products and services
  • Provides strong collaboration features
  • Offers excellent tutorials and other learning materials
  • Highly rated customer support

Cons

  • Some users complain that the same graphics appear frequently at different companies and websites
  • Can be challenging to learn
  • Users say that the interface can at times be slow and confusing
  • Some complaints about frequent bugs
  • Only more expensive plans have key privacy settings

Whatagraph 

Whatagraph is designed to handle performance monitoring and reporting. Marketing professionals use it to visualize data and build cross-channel reports. The application offers a variety of pre-designed templates and widgets, and it provides APIs for connecting numerous data sources, including Google Analytics, Facebook, LinkedIn, YouTube, HubSpot, Amazon Advertising and more.

Pros

  • Excellent features and support for social media and marketing
  • Built-in integrations for more than 30 data sources
  • Powerful cross-channel data integration and monitoring
  • Automated features for sending reports

Cons

  • Not highly customizable
  • Cross-channel integrations can be complex and require considerable time to set up
  • Some user complaints about the speed of the application
  • Some complaints about subpar customer support

Zoho Analytics

The self-service BI and data analytics software is designed to ingest large volumes of raw data and transform it into actionable visuals and reports via dashboards. It is available in both on-premise and cloud versions. The platform can pull data from numerous sources, including Google Analytics, Mailchimp, YouTube, Salesforce and Twitter. It offers a tiered pricing model.

Pros

  • Comes with more than 500 data connectors
  • Includes strong collaborative features with security protections
  • Includes AI-based augmented analytics that let users create data visualizations using natural language

Cons

  • Some users say the interface is not as user-friendly and intuitive as they would like
  • The application can be slow to generate data visualizations from very large data sets
  • Features and support for mobile platforms and devices are sometimes lacking
  • Some users complain that the application lacks flexibility, particularly in regard to changing reports

See more: Best Data Quality Tools & Software 2021

Comparison Table of the Best Data Visualization Tools

Data Visualization Software Pros Cons
Databox • Innovative features
• One-click integration with 70+ data services
• Extensive reporting formats
• Intuitive interface
• Integrations don’t always work well
• Reports aren’t highly customizable
• Some complaints about bugs and crashes
Google Data Studio • Free
• Intuitive drag-and-drop interface
• Strong collaboration features
• Highly customizable
• Difficult to use outside the Google ecosystem
• Reporting can be confusing
• Subpar customer support
iDashboards • Intuitive drag-and-drop interface
• Connectors for almost all major data sources
• Produces rich visualizations
• Highly flexible
• Can be difficult to set up and configure
• Large number of design options can be daunting
• Can be difficult to import very large files
Infogram • Large and varied collection of templates
• Intuitive and easy-to-use interface
• Integrates well with Google Drive, OneDrive and Dropbox
• Strong collaboration features
• Free plan is extremely limited
• Reports of frequent bugs and crashes
• It’s not possible to work on projects offline
Qlik Sense • Powerful features
• Supports a very wide range of data sources
• Includes machine learning and AI capabilities
• Works well on mobile devices
• Steep learning curve
• Requires some technical knowledge to build effective visualizations
• Not easily customizable
• Can be pricey with add-ons
Sisense • Powerful features and rich visualizations
• Intuitive user interface
• Flexible and customizable
• Incorporates natural language and other AI functions
• High customer support ratings
• Can exhibit slow performance for very large data loads
• May require scripting for more advanced visualizations
• Some complaints about documentation materials
Tableau • Fast and extremely powerful
• Intuitive interface
• Connects to most major data sources
• Supports most platforms and devices
• Expensive
• Difficult to customize
• Mixed user reviews about customer support
• Some security controls missing
Visme • Offers numerous templates
• Integrates with most major applications and data sources
• Strong collaboration
• Highly rated customer support
• Users complain they see the same graphics at different websites
• Can be challenging to learn the program
• Interface can be slow at times
• Some complaints about bugs
Whatagraph • Shines for marketing and social media
• Powerful cross-channel integration and monitoring
• Automated reporting features
• Not highly customizable
• Setting up integrations can be difficult and time-consuming
• Some user complaints about customer support
Zoho Analytics • More than 500 data connectors
• Strong collaboration with built-in security
• Offers AI and natural language features
• User interface could be more user-friendly
• Can be slow when accessing very large data sets
• Lacks flexibility for some users

 

Best Threat Intelligence Platforms https://www.datamation.com/security/threat-intelligence/ Fri, 21 May 2021 23:11:38 +0000 https://www.datamation.com/?p=21262 Staying in front of security threats is an increasingly difficult proposition. Despite a mind-boggling array of sophisticated tools, solutions and systems, the risks continue to grow. 

That’s where threat intelligence enters the picture. It attempts to step beyond traditional antivirus and other malware protection and offer insights and protection proactively. As zero-day attacks and polymorphic malware flourish, these systems aim to ratchet up detection and protection, typically through data analytics and machine learning.

Threat intelligence platforms (TIPs) aggregate, ingest and organize data from a number of sources — including internal logs and external feeds — to spot risks early. They use APIs, bots and other methods to examine data such as IP addresses, website content, server names and characteristics, and SSL certificates. Many platforms also rely on anonymous open source data sharing.

By examining patterns and various events and enriching the data, a TIP can spot unusual and threatening behaviors, tactics, techniques and procedures that can lead to an intrusion, data breach, ransomware or other cybersecurity problem. Many link to security information and event management (SIEM) solutions, endpoints, firewalls, APIs, intrusion prevention systems (IPSs) and other security components.  Many of the leading platforms also rely on human analysts to dig deeper.
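
The enrichment loop described above can be sketched in a few lines. To be clear, the feed structure, reputation scale (lower = worse) and field names below are illustrative assumptions, not any vendor’s actual schema:

```python
# A minimal sketch of TIP-style indicator enrichment; the feed format,
# reputation scale and threshold here are hypothetical.

threat_feed = {
    "203.0.113.7": {"type": "ip", "reputation": 10, "tag": "botnet-c2"},
    "badsite.example": {"type": "domain", "reputation": 35, "tag": "phishing"},
}

def enrich(event: dict) -> dict:
    """Attach feed intelligence to a log event when an indicator matches."""
    hit = threat_feed.get(event.get("src_ip")) or threat_feed.get(event.get("domain"))
    if hit:
        event["threat"] = hit
        event["alert"] = hit["reputation"] < 50  # assumed alerting threshold
    return event

events = [
    {"src_ip": "203.0.113.7", "domain": None},
    {"src_ip": "198.51.100.2", "domain": "example.org"},
]
enriched = [enrich(dict(e)) for e in events]
flagged = [e for e in enriched if e.get("alert")]  # only the botnet-c2 hit
```

A real platform would do this at scale against live feeds and route the flagged events to a SIEM or SOC workflow rather than a list.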

As staff working in security operations centers (SOCs) attempt to gain the upper hand on security risks, bad actors and emerging attack vectors, many are tapping threat intelligence frameworks. The value of a TIP is that it helps teams prioritize risks and threats and automate security responses. Emergen Research reports that the global threat intelligence market will reach $20.28 billion by 2028. What’s more, many platforms are turning to AI and machine learning to improve real-time threat intelligence.

Yet not all threat intelligence platforms are created equal. It’s critical to understand exactly what a platform offers, how it works, what it costs and what the vendor’s roadmap is for the future. With millions of threat indicators appearing daily — and many of them increasingly sophisticated — organizations are recognizing that quick assessment and response is a critical element in preventing economic and reputational damage.

How to Select the Right Threat Intelligence Platform

A number of factors are important when choosing a threat intelligence platform. Among them:

  • What data does the platform include and what’s the source of this data? It’s important to know how and where the vendor is collecting data, including the original source, and how it processes data. This might include factors such as IP addresses and domain URLs, reputational scores, newly discovered security risks and known vulnerabilities.
  • What format is the data? Vendors typically offer data feeds in CSV, XML, STIX, PDF and JSON. Some provide APIs to accommodate web services. In addition, it’s important to understand how the data is packaged — or how it can be adapted. This may include reports, summaries and alerts, along with customized feeds for customers.
  • How does the vendor formulate reports and alerts? What methodologies does it use to combine and blend data feeds en route to developing advisories and alerts? Does it rely only on machine data or use trained analysts? What other ways does the vendor distinguish itself from its peers?
  • How often does the vendor update the intelligence data? Ideally, data connections are real-time or constantly updated throughout a day.
  • What’s the price for a subscription? Prices among vendors vary greatly, often based on the type of services an organization requires. Some TIP vendors offer tiered product offerings, including free or inexpensive basic versions. Typically, the cost for an organization is several thousand dollars per month.
  • What’s included in the package? It’s important to know what resources the vendor has for learning how to use the platform and whether it offers any training. It’s also essential to know what services and support the vendor provides. Is there a 24/7 helpline? Is it live phone support or email support? If it’s the latter, how soon does the vendor respond? 
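
To make the format question concrete, here is a minimal sketch of ingesting one STIX 2.1 indicator from a JSON feed. The sample object follows the spec’s indicator shape, but the parsing is deliberately simplified: the regex handles only single-comparison patterns, whereas real STIX patterns can combine many observables.

```python
import json
import re

# Hedged sketch: parse one STIX 2.1 indicator and pull out the IoC value.
# Real feeds bundle many objects and use a full patterning grammar.

raw = """{
  "type": "indicator",
  "spec_version": "2.1",
  "id": "indicator--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f",
  "pattern": "[ipv4-addr:value = '203.0.113.7']",
  "pattern_type": "stix",
  "valid_from": "2021-05-01T00:00:00Z"
}"""

indicator = json.loads(raw)
match = re.search(
    r"\[(?P<obj>[\w-]+):(?P<prop>[\w.]+)\s*=\s*'(?P<value>[^']+)'\]",
    indicator["pattern"],
)
ioc = match.groupdict() if match else {}
# ioc -> {'obj': 'ipv4-addr', 'prop': 'value', 'value': '203.0.113.7'}
```

The same question applies to every format a vendor offers: CSV and JSON are easy to ingest, while STIX carries richer typed context at the cost of more parsing work.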

10 Top Threat Intelligence Platforms

See more: IBM Begins Cloud Confidentiality Push

AlienVault USM

The unified security management (USM) solution, part of AT&T, provides threat detection, incident response and compliance management capabilities. It collects and analyzes data from across attack surfaces, aggregates risks and threats — and continually updates threat information. The solution is designed to work within an ecosystem of AlienApps, which enables organizations to orchestrate and automate actions based on events.

Pros

  • Robust cloud support, including automated AWS and Azure discovery
  • Offers pre-built templates along with highly customizable reports and dashboards
  • Highly automated
  • Offers forensic querying
  • High customer ratings

Cons

  • Can be difficult to configure and customize
  • Some users say the interface can be challenging
  • Some users complain about inadequate customer support

Anomali ThreatStream

Anomali offers a robust platform for threat intelligence. It consolidates threat management and automates detection of risks with a set of tools that collect, manage, integrate, investigate and share data within an organization and from outside. The platform is available for on-premises and cloud-native deployments and includes support for virtual machines and air-gapping.

Pros

  • Excellent user interface
  • A mature platform with a deep and broad set of features
  • Supports numerous data formats
  • First-rate reporting capabilities
  • High customer support ratings

Cons

  • Some users complain about the lack of flexibility and an inability to adequately customize the platform
  • Lacks some automated reporting features
  • Inability to fully integrate with SIEM systems and freely move data between various systems

CrowdStrike Falcon

The company has established itself as a leader in the TIP space. It offers next-generation endpoint protection by combining antivirus (AV), endpoint detection and response (EDR) and a 24/7 managed hunting service via a lightweight agent that’s installed on devices. CrowdStrike’s services include advanced threat intelligence reporting and access to intelligence analysts who tailor intelligence and responses to an organization’s specific needs and requirements.

Pros

  • Large user base
  • Delivers high quality intelligence information using both machine and human analysis
  • Excellent and generally easy-to-use interface
  • Highly rated customer service and support
  • The lightweight agent doesn’t impact the performance and stability of systems

Cons

  • It’s a tiered service that can be pricey
  • Reporting functions aren’t as flexible as some users desire
  • Log management can be complex and confusing
  • Mac features lag behind Windows and Linux

FireEye Mandiant Threat Intelligence

The company has staked out a position as a pioneer and leader in the field. Its threat intelligence module is available as a software-as-a-service (SaaS) solution, and it combines both data analytics and human oversight to spot and thwart threats. FireEye includes a dashboard, machine intelligence functions and other tools to provide broad and deep real-time insights.

Pros

  • Delivers high-quality threat intelligence information due to both machine and human collection and analysis capabilities
  • Typically integrates well with other tools, such as SIEM
  • Offers a free version with limited features
  • Users give FireEye high ratings for customer support

Cons

  • Can require a high level of technical knowledge to interpret reports and use the platform effectively
  • Some users report that the platform generates too much technical data that’s not actionable

IBM X-Force Threat Intelligence Services

IBM offers an expansive platform for managing threat intelligence. At the center: the company’s blending of machine-readable real-time data and human oversight. IBM offers detailed intelligence reports on threat activity, malware, threat actor groups and industry assessments. Its enterprise intelligence management platform is designed to feed threat data to existing security systems within organizations.

Pros

  • Provides a high-quality and up-to-date view of threats collected from a wide array of sources
  • Forrester describes the “accuracy and specificity” of data as a core strength
  • Generates low false-positive rates

Cons

  • Some users complain that the interface could be more user friendly
  • Can be complex and difficult to use effectively
  • Intelligence information may be too general at times. Some users say the platform could provide more contextualized and precise information

IntSights External Threat Protection Suite

IntSights offers a threat intelligence platform that aggregates and enriches a diverse set of data sources. It includes a vulnerability risk analyzer and third party and dark web checker. The platform delivers information through a single dashboard, and it offers real-time context in order to prioritize risks and help organizations conduct investigations — and block threats.

Pros

  • Offers a well-designed and easy-to-use interface
  • Provides rich and varied data
  • Highly rated customer sales and support

Cons

  • Reporting features aren’t as flexible or robust as some users would like
  • Sometimes delivers too much unneeded data along with dated threat intelligence information
  • Limited information and insights into dark web activities and behaviors

Kaspersky Threat Intelligence Services

Although threat intelligence is only one part of Kaspersky’s overall cybersecurity focus, the company is a leader in the threat intelligence space. It provides threat data feeds, threat lookups and digital footprint intelligence that can expose an organization’s weak spots.

Pros

  • Provides high-quality threat data
  • The company is aggressively focused on adding third-party integrations and support for new data sources
  • Offers rich reporting capabilities

Cons

  • Users complain that the solution can be complex and at times difficult to use
  • Sometimes provides too much general or irrelevant data
  • The user community reports high false-positive rates
  • Lacks automation that other leading vendors provide in their TIP platforms

Mimecast Threat Intelligence 

With a focus on email security, Mimecast examines numerous data sources to detect attacks. The subscription-based cloud security service is designed to protect email systems from various types of threats, ranging from viruses to ransomware. This includes URL protection that identifies, blocks and rewrites malicious links in email. The threat intelligence platform also helps prevent users from accessing dangerous sites or downloading malicious content.

Pros

  • Highly scalable
  • URL protection methods are highly effective in thwarting phishing and malware
  • A security operations center continuously monitors and analyzes threats

Cons

  • A focus on email security means that an organization will likely require other threat intelligence solutions
  • Users complain that Mimecast provides minimal support for archived emails

Palo Alto Networks WildFire

Harnessing inline machine learning, bare metal analysis and dynamic and static analysis, WildFire delivers a threat intelligence platform designed for zero-day malware protection. The TIP blocks unknown and high-risk file types, scripts and other data by extracting pieces of files, analyzing them and conducting data analysis across hundreds of behavioral characteristics.
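
As one concrete illustration of the static side of such analysis, a common building block in malware triage is byte entropy, since packed or encrypted payloads score near the 8-bits-per-byte maximum. This is a generic sketch of the idea, not WildFire’s actual pipeline:

```python
import hashlib
import math
from collections import Counter

# Illustrative sketch only — not any vendor's actual analysis pipeline.
# High byte entropy is one behavioral/static signal among hundreds.

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = bytes(range(256)) * 4               # uniform byte distribution
digest = hashlib.sha256(sample).hexdigest()  # lookup key for shared intelligence
entropy = byte_entropy(sample)               # 8.0 for a uniform distribution

text_like = b"the quick brown fox " * 50     # natural text scores much lower
```

In practice a platform would combine many such features with dynamic (sandbox) observations before issuing a verdict.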

Pros

  • Incorporates machine learning
  • Uses a multi-layered approach to increase threat detection
  • Highly automated
  • Strong integration with SIEMs and other tools
  • Large user base of 35,000+ delivers excellent shared intelligence

Cons

  • Expensive compared to other platforms
  • Can be difficult to set up, and it’s not easily customizable
  • Some users complain about the lack of customer support

Recorded Future

The vendor pulls and classifies data from “billions of entities” across languages and geographies to map relationships and spot threats. It combines advanced analytics and machine learning to discover, categorize and deliver real-time threat intelligence. Recorded Future also relies on a team of human analysts to guide data models and provide direction.

Pros

  • Delivers robust and extensive data collection capabilities and security intelligence
  • Highly flexible with different modules designed for specific needs and risks
  • Excellent interface
  • Strong search capabilities, including the ability to set up automated queries
  • Supports numerous types of threat intelligence, including brand, SecOps, threats, vulnerabilities, geopolitical and third party

Cons

  • Licensing model can be complex and expensive if a company uses multiple modules
  • Some users complain that the API is not as mature and robust as they would like
  • May require considerable training to use all the various features and capabilities

See more: Managed Security Services Provider Releases Integrated Cybersecurity Platform

Comparison Table of Threat Intelligence Platforms

Threat Intelligence Platform Pros Cons
AlienVault USM • Strong automation
• Offers pre-built templates
• Flexible
• Features forensic querying
• Can be difficult to configure and customize
• Interface can be confusing
• Users say customer support is sometimes lacking
Anomali ThreatStream • Excellent user interface
• Rich feature set
• Support for numerous data formats
• Strong reporting features
• Can be difficult to customize
• Missing some automated reporting features
• Doesn’t always play well with SIEMs
CrowdStrike Falcon • Large user base
• Provides high-quality threat information
• Excellent interface
• Lightweight agent uses few system resources
• Can be pricey
• Some reporting functions lack flexibility
• Log management can be confusing
• Mac features lag behind Windows and Linux
FireEye Threat Intelligence • Provides high-quality threat information
• Integrates well with SIEMs and other tools
• Excellent customer support
• May require deep technical knowledge
• Some users complain about receiving too much data
IBM X-Force • Extensive data collection capabilities
• Provides high-quality threat information
• Produces low false-positive rates
• Some users find the user interface confusing
• May require deep technical knowledge
• Information is sometimes too broad and non-specific
IntSights External Threat Protection Suite • First-rate user interface
• Offers rich and varied threat information
• Customer support is highly rated by users
• Lacks some desirable reporting features
• Delivers too much nonspecific information at times
• Users say that some threat intelligence information is dated
Kaspersky Threat Intelligence Services • Provides high-quality threat information
• Vendor is aggressively adding features
• Rich reporting capabilities
• Some users say the platform is complex
• Sometimes provides too much general data
• High false-positive rates
• Lacks some automation features
Mimecast Threat Intelligence • Highly scalable
• Effective in preventing phishing attacks
• Continually updates solution based on changing threat landscape
• Effective only for email, so broader threat intelligence requires other tools
• Limited support for scanning archived emails
Palo Alto Networks WildFire • Highly automated, with a multi-layer detection framework
• Strong SIEM support
• Large user base for threat intelligence information sharing
• Can be expensive
• Difficult to set up and customize
• Some users complain about inadequate customer support
Recorded Future • Robust and extensive data collection
• Highly flexible
• Excellent user interface
• Provides broad threat intelligence
• Can be expensive
• Some users complain about API support
• Can be complicated and difficult to set up and use
 

Best Data Quality Tools & Software for 2023 https://www.datamation.com/big-data/data-quality-tools/ Thu, 15 Apr 2021 23:15:00 +0000 http://datamation.com/2019/06/20/10-top-data-quality-tools/ Data quality management is a critical issue in today’s data centers. The complexity of the cloud continues to grow, leading to an increasing need for data quality tools that analyze, manage, and scrub data from numerous sources, including databases, email, social media, logs, and the Internet of Things (IoT).

TABLE OF CONTENTS
What Are Data Quality Tools?
How Are Data Quality Tools Used?
How To Select The Best Data Quality Tool
Top Data Quality Tools & Software
Comparison Chart of Data Quality Software

What Are Data Quality Tools?

Data quality tools clean data by removing formatting errors, typos, and redundancies, while ensuring that organizations apply rules, automate processes, and have logs that provide details about processes. Used effectively, these data quality tools can remove inconsistencies that drive up enterprise expenses and annoy customers and business partners. They also drive productivity gains and increase revenues.

How Are Data Quality Tools Used?

Data quality software helps data managers address four crucial areas of data management: data cleansing, data integration, master data management, and metadata management. These tools go beyond basic human analysis and typically identify errors and anomalies through the use of algorithms and lookup tables.

Over the years, these tools have become more sophisticated and automated, making them both easier to use and more capable. Data quality tools now tackle numerous tasks, including validating contact information and mailing addresses, data mapping, data consolidation associated with extract, transform and load (ETL) tools, data validation reconciliation, sample testing, data analytics, and all forms of big data handling.
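
Two of the routine tasks named above — contact validation and deduplication — can be sketched with nothing but the standard library; real tools layer fuzzy matching, lookup tables and audit logging on top. The field names and rules here are illustrative assumptions:

```python
import re

# Minimal sketch of validate-then-dedupe; production tools add fuzzy
# matching, reference lookups and change logs.

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

records = [
    {"name": "Ada Lovelace", "email": "ada@example.com "},
    {"name": "ada lovelace", "email": "ADA@EXAMPLE.COM"},
    {"name": "Bad Row", "email": "not-an-email"},
]

def cleanse(rows):
    seen, clean = set(), []
    for row in rows:
        email = row["email"].strip().lower()
        if not EMAIL_RE.match(email):
            continue                      # in practice, route to a review queue
        key = (row["name"].strip().lower(), email)
        if key in seen:
            continue                      # drop exact duplicate after normalization
        seen.add(key)
        clean.append({"name": row["name"].strip().title(), "email": email})
    return clean

result = cleanse(records)  # one valid, deduplicated record survives
```

Even this toy version shows why normalization must happen before matching: the two Ada records only collapse once case and whitespace are standardized.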

Also see: Top 15 Data Warehouse Tools

How To Select The Best Data Quality Tool

Identifying the right data quality management software is important for data managers who want to assess and improve the overall usability of their databases. Finding a superior data quality tool hinges on many key factors, including how and where an organization stores and uses data, how data flows across networks, and what type of data a team is attempting to tackle.

Although basic data quality tools are available for free through open source frameworks, many of today’s solutions offer sophisticated capabilities that work with multiple platforms and database formats. It is important to understand what a particular data quality tool can do for your enterprise — and whether you may need multiple tools to address more complex scenarios.

Consider these three factors when choosing a data quality management platform to address your business needs:

1. Identify your data challenges.

Incorrect data, duplicate data, missing data, and other data integrity issues can significantly impact — and undermine — the success of a business initiative. A haphazard or scattershot approach to maintaining data integrity may result in wasted time and resources, subpar performance, and frustrated employees and customers. To avoid these problems, start by conducting an analysis of existing data sources, the tools currently in use, and the problems and issues that occur. This proactive approach delivers insight into gaps and possible fixes.

2. Understand what data quality tools can and cannot do.

There’s no fix for completely broken, incomplete, or missing data. Data cleansing tools cannot perform magic on dated legacy systems or sloppy spreadsheets. If your organization identifies gaps and shortcomings in its data collection and management methods, it may be necessary to go back to the drawing board and examine the entire data framework. This includes the data management tools you’re currently using, how your organization manages and stores data, and what workflows and processes could be changed and improved.

3. Understand the strengths and weaknesses of various data cleansing tools.

It’s obvious that not all data quality management tools are created equal. Data cleansing tools offer different strengths and weaknesses: some are designed to enhance specific applications such as Salesforce or SAP, others excel at spotting errors in physical mailing addresses or email, and still others tackle IoT data or pull together disparate data types and formats, so you need to decide which features are most important to your organization. In your decision-making process, it’s also important to understand how a data cleansing tool works and what level of automation it offers, as well as the specific features you will need to accomplish key tasks. Finally, it’s crucial to consider factors such as data controls/security and licensing costs.

Top Data Quality Tools & Software

Cloudingo

Value proposition for potential buyers: Cloudingo is a prominent data integrity and data cleansing tool designed for Salesforce. It tackles everything from deduplication and data migration, to spotting human errors and data inconsistencies. The platform handles data imports, delivers a high level of flexibility and control, and includes strong security protections.

Key values/differentiators:

  • The application uses a drag-and-drop graphical interface to eliminate coding and spreadsheets. It includes templates with filters that allow for customization, and it offers built-in analytics. APIs support both representational state transfer (REST) and simple object access protocol (SOAP), making it possible to run the application from the cloud or from internal systems.
  • The data cleansing management tool handles all major requirements including merging duplicate records and converting leads to contacts, deduplicating import files, deleting stale records, automating tasks on a schedule, and providing detailed reporting functions about change tracking. It offers near real-time synchronization of data.
  • The application includes strong security controls that include permission-based logins and simultaneous logins. Cloudingo supports unique and separate user accounts and tools for auditing who has made changes.

Data Ladder

Value proposition for potential buyers: The vendor has established itself as a leader in data cleansing through a comprehensive set of tools that clean, match, dedupe, standardize and prepare data. Data Ladder is designed to integrate, link, and prepare data from nearly any source. It uses a visual interface and taps a variety of algorithms to identify phonetic, fuzzy, abbreviated, and domain-specific issues.

Key values/differentiators:

  • The company’s DataMatch Enterprise solution aims to deliver an accuracy rate of 96 percent on samples of 40K to 8M records, based on an independent analysis. It uses multi-threaded, in-memory processing to boost speed and accuracy, and it supports semantic matching for unstructured data.
  • Data Ladder supports integrations with a vast array of databases, file formats, big data lakes, enterprise applications, and social media. It provides templates and connectors for managing, combining, and cleansing data sources. This includes Microsoft Dynamics, Sage, Excel, Google Apps, Office 365, SAP, Azure Cosmos database, Amazon Athena, Salesforce, and dozens of others.
  • The data standardization features draw on more than 300,000 pre-built rules, while also allowing customizations. The system uses proprietary built-in pattern recognition, but it also lets organizations build their own RegEx-based patterns visually.
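
To make the rule-based standardization idea concrete, here is a hypothetical illustration in plain Python; Data Ladder’s own rule syntax is visual and proprietary, so this only mirrors the concept, and the abbreviation rules are invented for the example:

```python
import re

# Hypothetical RegEx-based standardization rules (not Data Ladder's
# actual syntax): expand address abbreviations, then tidy whitespace.

RULES = [
    (re.compile(r"\bSt\b\.?", re.I), "Street"),
    (re.compile(r"\bAve\b\.?", re.I), "Avenue"),
    (re.compile(r"\s{2,}"), " "),     # collapse runs of whitespace
]

def standardize(address: str) -> str:
    for pattern, replacement in RULES:
        address = pattern.sub(replacement, address)
    return address.strip()

print(standardize("221B  Baker St."))    # 221B Baker Street
print(standardize("5th Ave  New York"))  # 5th Avenue New York
```

Rule order matters: abbreviation expansion runs before whitespace cleanup so the rules never fight each other.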

IBM InfoSphere QualityStage

Value proposition for potential buyers: IBM’s data quality application, available on-premise or in the cloud, offers a broad yet comprehensive approach to data cleansing and data management. The focus is on establishing consistent and accurate views of customers, vendors, locations, and products. InfoSphere QualityStage is designed for big data, business intelligence, data warehousing, application migration, and master data management.

Key values/differentiators:

  • IBM offers a number of key features designed to produce high quality data. A deep data profiling tool delivers analysis to aid in understanding content, quality and structure of tables, files, and other formats. Machine learning can auto-tag data and identify potential issues.
  • The platform offers more than 200 built-in data quality rules that control the ingestion of bad data. The tool can route problems to the right person so that the underlying data problem can be addressed.
  • A data classification feature identifies personally identifiable information (PII) that includes taxpayer IDs, credit cards, phone numbers, and other data. This feature helps eliminate duplicate records or orphan data that can wind up in the wrong hands.
  • The platform supports strong governance and rule-based data handling. It includes strong security features.
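
The PII-classification idea can be sketched as a pattern scan over free text; this toy version is not IBM’s classifier, which uses far richer rules, checksums and context, and the two patterns below are simplified assumptions:

```python
import re

# Toy PII scan — illustrative only. Real classifiers validate checksums
# (e.g., Luhn for card numbers) and use locale-aware phone rules.

PII_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def classify_pii(text: str) -> dict:
    """Return each PII label found in the text with its matched values."""
    return {label: pat.findall(text)
            for label, pat in PII_PATTERNS.items() if pat.search(text)}

found = classify_pii("Call 555-867-5309 re: card 4111 1111 1111 1111")
```

A tool would then use such labels to mask, quarantine or delete the offending records before they reach the wrong hands.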

Informatica Quality Data and Master Data Management

Value proposition for potential buyers: Informatica has adopted a framework that handles a wide array of tasks associated with data quality and Master Data Management (MDM). This includes role-based capabilities, exception management, artificial intelligence insights into issues, pre-built rules and accelerators, and a comprehensive set of data quality transformation tools.

Key values/differentiators:

  • Informatica’s Data Quality solution is adept at handling data standardization, validation, enrichment, deduplication, and consolidation. The vendor offers versions designed for cloud data residing in Microsoft Azure and AWS.
  • The vendor also offers a Master Data Management (MDM) application that addresses data integrity through matching and modeling, metadata and governance, and cleansing and enriching. Among other things, Informatica MDM automates data profiling, discovery, cleansing, standardizing, enriching, matching, and merging within a single central repository.
  • The MDM platform supports nearly all types of structured and unstructured data, including applications, legacy systems, product data, third party data, online data, interaction data, and IoT data.

OpenRefine

Value proposition for potential buyers: OpenRefine, formerly known as Google Refine, is a free open source tool for managing, manipulating, and cleansing data, including big data. The application can accommodate up to a few hundred thousand rows of data. It cleans, reformats and transforms diverse and disparate data. OpenRefine is available in several languages, including English, Chinese, Spanish, French, Italian, Japanese, and German.

Key values/differentiators:

  • OpenRefine cleans and transforms data from a wide variety of sources, including standard applications, the web, and social media data.
  • The application provides powerful editing tools to remove formatting, filter data, rename data, add elements, and accomplish numerous other tasks. In addition, the application can interactively change large chunks of data in bulk to fit different requirements.
  • The ability to reconcile and match diverse data sets makes it possible to obtain, adapt, cleanse, and format data for web services, websites, and numerous database formats. In addition, OpenRefine accommodates numerous extensions and plugins that work with many data sources and data formats.
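
OpenRefine’s best-known bulk-editing feature, key-collision clustering, groups values that share a normalized “fingerprint.” The sketch below reimplements the basic fingerprint idea in plain Python as an illustration; it is not OpenRefine’s own code, and the real keyer also handles accents and other normalization steps:

```python
import re

def fingerprint(value: str) -> str:
    """Normalize a value roughly the way OpenRefine's fingerprint keyer
    does: lowercase, strip punctuation, then sort and deduplicate tokens."""
    tokens = re.sub(r"[^\w\s]", "", value.lower()).split()
    return " ".join(sorted(set(tokens)))

def cluster(values: list[str]) -> dict[str, list[str]]:
    """Group raw values whose fingerprints collide."""
    clusters: dict[str, list[str]] = {}
    for v in values:
        clusters.setdefault(fingerprint(v), []).append(v)
    return clusters

names = ["Acme, Inc.", "acme inc", "Inc. Acme", "Globex Corp"]
print(cluster(names))
```

All three spellings of “Acme, Inc.” collapse into one cluster, which a user could then merge into a single canonical value.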

SAS Data Management

Value proposition for potential buyers: SAS Data Management is a role-based graphical environment designed to manage data integration and cleansing. It includes powerful tools for data governance and metadata management, ETL and ELT, migration and synchronization capabilities, a data loader for Hadoop, and a metadata bridge for handling big data. Gartner named SAS a “Leader” in its 2020 Magic Quadrant for Data Integration Tools.

Key values/differentiators:

  • SAS Data Management offers a powerful set of wizards that aid in the entire spectrum of data quality management. These include tools for data integration, process design, metadata management, data quality controls, ETL and ELT, data governance, migration and synchronization, and more.
  • Strong metadata management capabilities aid in maintaining accurate data. The application offers mapping, data lineage tools that validate information, wizard-driven metadata import and export, and column standardization capabilities that aid in data integrity.
  • Data cleansing takes place in native languages with specific language awareness and location awareness for 38 regions worldwide. The application supports reusable data quality business rules, and it embeds data quality into batch, near-time, and real-time processes.

Precisely Trillium

Value proposition for potential buyers: Precisely’s purchase of Trillium has positioned the company as a leader in the data integrity space. It offers five versions of the plug-and-play application: Trillium Quality for Dynamics, Trillium Quality for Big Data, Trillium DQ, Trillium Global Locator, and Trillium Cloud. All address different tasks within the overall objective of optimizing and integrating accurate data into enterprise systems.

Key values/differentiators:

  • Trillium Quality for Big Data cleanses and optimizes data lakes. It uses machine learning and advanced analytics to spot dirty and incomplete data, while delivering actionable business insights across disparate data sources.
  • Trillium DQ works across applications to identify and fix data problems. The application, which can be deployed on-premises or in the cloud, supports more than 230 countries, regions and territories. It integrates with numerous architectures, including Hadoop, Spark, SAP, and Microsoft Dynamics.
  • Trillium DQ can not only find missing, duplicate, and inaccurate records but also uncover relationships within households, businesses, and accounts. It can add missing postal information, latitude and longitude coordinates, and other key types of reference data.
  • Trillium Cloud focuses on data quality for public, private, and hybrid cloud platforms and applications. This includes cleansing, matching, and unifying data across multiple data sources and data domains.

Talend Data Quality

Value proposition for potential buyers: Talend focuses on producing and maintaining clean and reliable data through a sophisticated framework that includes machine learning, pre-built connectors and components, data governance and management, and monitoring tools. The platform addresses data deduplication, validation, and standardization. It supports both on-premises and cloud-based applications while protecting PII and other sensitive data. Gartner rated the firm a “Leader” in its 2020 Magic Quadrant for Data Integration Tools.

Key values/differentiators:

  • The data integrity application uses a graphical interface and drill down capabilities to display details about data integrity. It allows users to evaluate data quality against custom-designed thresholds and measure performance against internal or external metrics and standards.
  • The application enforces automatic data quality error resolution through enrichment, harmonization, fuzzy matching, and deduplication.
  • Talend offers four versions of its data quality software. These include two open-source versions with basic tools and features, and a more advanced subscription-based model that includes robust data mapping, reusable “joblets,” wizards, and interactive data viewers. More advanced cleansing and semantic discovery tools are available only with the company’s paid Data Management Platform.
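
Fuzzy matching of the kind described above can be approximated with the standard library’s difflib; the 0.85 similarity threshold below is an arbitrary illustration, not a Talend default:

```python
from difflib import SequenceMatcher

def dedupe(records: list[str], threshold: float = 0.85) -> list[str]:
    """Keep a record only if it isn't a near-duplicate of one already kept."""
    kept: list[str] = []
    for rec in records:
        if not any(SequenceMatcher(None, rec.lower(), k.lower()).ratio() >= threshold
                   for k in kept):
            kept.append(rec)
    return kept

rows = ["Jon Smith", "John Smith", "Jane Doe"]
print(dedupe(rows))  # ['Jon Smith', 'Jane Doe']
```

Production tools typically compare multiple fields (name, address, email) and use phonetic or token-based matchers rather than a single string-similarity score.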

TIBCO Clarity

Value proposition for potential buyers: TIBCO Clarity places a heavy emphasis on analyzing and cleansing large volumes of data to produce rich and accurate data sets. The application is available in on-premises and cloud versions. It includes tools for profiling, validating, standardizing, transforming, deduplicating, cleansing, and visualizing for all major data sources and file types.

Key values/differentiators:

  • Clarity offers a powerful deduplication engine that supports pattern-based searches to find duplicate records and data. The highly customizable search engine lets users deploy match strategies based on a wide array of criteria, including columns, thesaurus tables, and multiple languages. It also lets users run deduplication against a dataset or an external master table.
  • A faceting function allows users to analyze and regroup data according to numerous criteria, including by star, flag, empty rows, and text patterns. This simplifies data cleanup while providing a high level of flexibility.
  • The application supports strong editing functions that let users manage columns, cells, and tables. It supports splitting and managing cells, blanking and filling cells, and clustering cells.
  • The address cleansing function works with TIBCO GeoAnalytics as well as Google Maps and ArcGIS.

Validity DemandTools

Value proposition for potential buyers: Validity, the maker of DemandTools, delivers a robust collection of tools designed to manage CRM data within Salesforce. The product accommodates large data sets and identifies and deduplicates data within any database table. It can perform multi-table mass manipulations and standardize Salesforce objects and data. The application is flexible and highly customizable, and it includes powerful automation tools.

Key values/differentiators:

  • The vendor focuses on providing a comprehensive suite of data integrity tools for Salesforce administrators. DemandTools compares a variety of internal and external data sources to deduplicate, merge, and maintain data accuracy.
  • DemandTools offers many powerful features, including the ability to reassign ownership of data. In addition, a Find/Report module allows users to pull external data, such as an Excel spreadsheet or Access database, into the application and compare it to any data residing inside a Salesforce object.
  • The Validity JobBuilder tool automates data cleansing and maintenance tasks by merging duplicates, backing up data, and handling updates according to preset rules and conditions.

Comparison Chart of Data Quality Software

| Vendor | Tools | Focus | Key Features |
| --- | --- | --- | --- |
| Cloudingo | Cloudingo | Salesforce data | Deduplication; data migration management; spots human and other errors/inconsistencies |
| Data Ladder | DataMatch Enterprise; ProductMatch | Diverse data sets across numerous applications and formats | More than 300,000 prebuilt rules; templates and connectors for most major applications |
| IBM | InfoSphere QualityStage | Big data; business intelligence; data warehousing; application migration; master data management | More than 200 built-in data quality rules; strong machine learning and governance tools |
| Informatica | Data Quality; Master Data Management | Diverse data sets; supports Azure and AWS | Data standardization, validation, enrichment, deduplication, and consolidation |
| OpenRefine | OpenRefine | Transforms, cleanses and formats data for analytics and other purposes | Powerful capture and editing functions |
| SAS | Data Management | Managing data integration and cleansing for diverse data sources and sets | Strong metadata management; language-aware cleansing for 38 regions |
| Precisely | Trillium Quality for Dynamics; Trillium Quality for Big Data; Trillium DQ; Trillium Global Locator; Trillium Cloud | Cleansing, optimizing and integrating data from numerous sources | Trillium DQ supports more than 230 countries, regions and territories; works with major architectures, including Hadoop, Spark, SAP and Microsoft Dynamics |
| Talend | Data Quality | Data integration | Deduplication, validation and standardization using machine learning; templates and reusable elements to aid in data cleansing |
| TIBCO | Clarity | High-volume data analysis and cleansing | Tools for profiling, validating, standardizing, transforming, deduplicating, cleansing and visualizing all major data sources and file types |
| Validity | DemandTools | Salesforce data | Multi-table mass manipulations; standardizes Salesforce objects and data through deduplication and other capabilities |
How Does Edge Computing Work & What Are the Benefits? https://www.datamation.com/edge-computing/edge-computing/ Fri, 09 Apr 2021 22:55:31 +0000

Edge computing is a broad term that refers to a highly distributed computing framework that moves compute and storage resources closer to the exact point they’re needed—so they’re available at the moment they’re needed. Edge computing companies provide solutions that reduce latency, speed processing, optimize bandwidth and introduce entirely different features and capabilities that aren’t possible with centralized data centers.

The ability to process data analytics on an edge network enables features and capabilities that are crucial for advanced digital frameworks, including the Fourth Industrial Revolution. This includes IoT software, highly integrated supply chains, machine learning, artificial intelligence (AI), mobile connectivity, virtual reality and augmented reality, digital twins, robotics, 3D fabrication, medical devices, autonomous vehicles, connected video cameras, smart home automation and much more.

Although conventional servers, storage, and cloud computing continue to play a key role in computing, edge technologies are radically redefining business and life. By moving data processing at or near the source of data generation, edge devices become smarter and they’re able to handle tasks that would have been unimaginable only a few years ago. This data fuels real-time insights and applications ranging from sleep tracking and ridesharing to the condition of a drilling bit on an oil rig.

Digital business increasingly depends on handling tasks at the point where a device or person resides. The ability to construct a distributed computing model and harness localized computing power is at the foundation of the Internet of Things (IoT) and today’s advanced digital technologies.

Edge computing fundamentally rewires and revamps the way organizations generate, manage and consume data. Gartner estimates that by 2025, 75% of data will be created and processed outside the traditional data center or cloud. 

How Does Edge Computing Work?

Edge computing works by capturing and processing information as close to the source of the data or desired event as possible. It relies on sensors, computing devices and machinery to collect data and feed it to edge servers or the cloud. Depending on the desired task and outcome, this data might feed analytics and machine learning systems, deliver automation capabilities or offer visibility into the current state of a device, system or product.

Today, most data calculations take place in the cloud or at a datacenter. However, as organizations migrate to an edge model with IoT devices, there’s a need to deploy edge servers, gateway devices and other gear that reduce the time and distance required for computing tasks—and connect the entire infrastructure. Part of this infrastructure may include smaller edge data centers located in secondary cities or even rural areas, or cloud containers that can easily be moved across clouds and systems, as needed.

Yet edge data centers aren’t the only way to process data. In some cases, IoT devices might process data onboard, or send the data to a smartphone, an edge server or storage device to handle calculations. In fact, a variety of technologies can make up an edge network. These include mobile edge computing that works over wireless channels; fog computing that incorporates infrastructure that uses clouds and other storage to place data in the most desirable location; and so-called cloudlets that serve as ultra-small data centers.

An edge framework introduces the flexibility, agility and scalability required for a growing array of business use cases. For example, a sensor might provide real-time updates showing whether a vaccine has been kept at its required temperature throughout transport.

Sensors and edge IoT devices can track traffic patterns and provide real-time insights into congestion and routing. And motion sensors can incorporate AI algorithms that detect when an earthquake has occurred to provide an early warning that allows businesses and homes to shut off gas supplies and other systems that could result in a fire or explosion.

What is an Edge Device?

Within an edge network, systems that capture and transmit data serve as edge devices. This can include standalone devices as well as gateways that connect to devices downstream and interface with systems upstream. However, the concept increasingly revolves around IoT sensors and devices that reside at the edge. These systems can incorporate an array of sensing capabilities, including light, sound, magnetic fields, motion, moisture, tactile capabilities, gravity, electrical fields and chemicals. They may process data using apps or via on-board computing capabilities, and they often include batteries.

Edge and IoT devices can tap a variety of communications protocols, including

  • Bluetooth Low Energy (BLE)
  • RFID
  • Wi-Fi
  • Zigbee
  • Z-Wave
  • Cellular (including 5G)
  • NFC
  • Ethernet

Edge IoT devices typically send data over an open systems interconnection (OSI) framework that unites disparate devices and standards. These systems also connect with cloud and Internet protocols such as AMQP, MQTT, CoAP, and HTTP. Typically, the framework relies on a specialized edge device, such as a smart gateway, to route and transfer data and manage all the connections.
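
As a concrete illustration of the MQTT pattern mentioned above, a sensor typically publishes a small JSON payload to a hierarchical topic. The topic scheme and field names below are hypothetical, and a real client library (such as Eclipse Paho) would handle the actual network transport:

```python
import json
import time

def build_telemetry(site: str, device_id: str, temperature_c: float) -> tuple[str, bytes]:
    """Compose an MQTT-style topic and JSON payload for one sensor reading."""
    topic = f"{site}/sensors/{device_id}/telemetry"  # hypothetical topic scheme
    payload = json.dumps({
        "device_id": device_id,
        "temperature_c": temperature_c,
        "ts": int(time.time()),  # Unix timestamp of the reading
    }).encode("utf-8")
    return topic, payload

topic, payload = build_telemetry("plant-a", "temp-042", 3.7)
print(topic)  # plant-a/sensors/temp-042/telemetry
print(json.loads(payload))
```

The hierarchical topic is what lets a gateway or broker subscribe to, say, everything under `plant-a/sensors/#` with a single wildcard subscription.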

The microprocessors used in IoT devices continue to advance. Not only are some chips able to accommodate onboard processing—including AI and machine learning functions—they’re becoming smarter and more energy efficient. Some can wake on demand and include hard-wired capabilities. 5G chips are also changing the IoT and the edge by imparting devices with faster and more robust communications capabilities. As a result, the IoT and edge frameworks continue to advance and gain new capabilities.

How Does an Edge Gateway Work?

In order for IoT devices to deliver real value, there must be a way to connect the edge to the cloud and corporate data centers.

An edge gateway serves this purpose. After IoT edge devices collect data and local processing takes place—either on the device or within a separate device such as a smartphone or cloudlet—a gateway manages the flow of data between the edge network and a cloud or data center. Using either conventional coding or machine learning capabilities, it can send only necessary or optimal data, thus optimizing bandwidth and cutting costs.
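
The “send only necessary or optimal data” behavior can be as simple as delta or threshold filtering. Here is a minimal sketch, with an arbitrary change threshold standing in for whatever rule or learned model a real gateway would apply:

```python
def filter_readings(readings: list[float], min_delta: float = 0.5) -> list[float]:
    """Forward a reading upstream only when it differs from the last
    forwarded value by at least min_delta, saving bandwidth."""
    forwarded: list[float] = []
    for value in readings:
        if not forwarded or abs(value - forwarded[-1]) >= min_delta:
            forwarded.append(value)
    return forwarded

# Six raw readings collapse to three upstream messages.
print(filter_readings([20.0, 20.1, 20.2, 21.0, 21.1, 19.9]))  # [20.0, 21.0, 19.9]
```

In practice the gateway might also batch, compress, or summarize readings, but the principle is the same: the cloud sees only the data worth transmitting.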

An edge gateway also interacts with IoT edge devices downstream, telling them when to switch on and off or how to adjust to conditions. When there’s a need for data it can ping the device.

This enables analytics and machine learning on the edge, the ability to isolate devices, manage traffic patterns more effectively, and connect the gateway to other gateways, thus establishing a larger and more modular network of connected devices. As a result, an IoT framework can operate in a highly dynamic way.

What is an Edge Network?

An edge network resides outside or adjacent to a centralized network. Essentially, the edge network feeds data to the main network—and pulls data from it as needed. Early edge networks encompassed content delivery networks (CDNs) that helped speed video delivery to mobile devices.

But today’s edge networks are increasingly modular and interconnected—and carry a broad array of data. Today’s software-defined networking tools deliver enormous flexibility, scalability and customization for edge networks. In many cases, application programming interfaces (APIs) extend the reach of an edge network while automating workflows.

Edge networks can support an array of advanced capabilities, including traffic scalability and load balancing, context-aware routing that allows data to follow the most efficient path and avoid disruptions, and real-time controls that make it possible to change rules, logic and programming dynamically—depending on internal needs or external conditions.
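
At its simplest, the context-aware routing described above reduces to choosing the healthiest available path at send time. The link names and latency figures below are invented for illustration:

```python
def pick_path(paths: dict[str, float], disrupted: set[str]) -> str:
    """Choose the lowest-latency path that isn't currently disrupted."""
    available = {name: latency for name, latency in paths.items()
                 if name not in disrupted}
    if not available:
        raise RuntimeError("no available path")
    return min(available, key=available.get)

# Hypothetical edge links with measured latency in milliseconds.
links = {"fiber-backhaul": 4.0, "5g-uplink": 11.0, "satellite": 45.0}
print(pick_path(links, disrupted=set()))               # fiber-backhaul
print(pick_path(links, disrupted={"fiber-backhaul"}))  # 5g-uplink
```

A software-defined network does this continuously and per-flow, updating the latency measurements and disruption set in real time.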

Advanced edge networks support edge computing capabilities. This makes it possible to run algorithms and applications on the edge—and process and distribute data in more dynamic ways.

Edge Computing vs. Cloud Computing

There are fundamental differences between cloud computing and edge computing. The former relies on a central computing model that delivers applications, processing and data services from large facilities, while the latter is a computing model that’s highly distributed.

Edge environments typically strive to move applications and data processing as close to the data-generation site as possible. As robotics, drones, autonomous vehicles, digital twins and numerous other digital technologies mature, the need to handle computing outside the cloud grows.

Not surprisingly, organizations typically use both cloud and edge networks to design a modern IoT framework. The two technology platforms are not oppositional; they are complementary. Each has a role in building a modern data framework.

While edge computing can deliver a more agile and flexible framework—and reduce latency on IoT devices—it’s not equipped to accommodate enormous volumes of data that might feed an analytics application or smart city framework. What’s more, cloud bandwidth is highly scalable and cloud computing often supports a more streamlined IT and programming framework.

What are the Benefits of Edge Computing?

As organizations wade deeper into the digital realm, edge computing and edge technologies eventually become a necessity. There’s simply no way to tie together vast networks of IoT edge devices without a nimbler and more flexible framework for computing, data management and running applications outside a datacenter. Edge computing boosts device performance and data agility. It also can reduce the need for more expensive cloud resources, and thus save money.

Also, because edge computing networks are highly distributed and essentially run as smaller interconnected networks, it’s possible to use hardware and software in highly targeted and specialized ways.

This makes it possible, for example, to use different programming languages with different attributes and runtimes to achieve specific performance results. The downside is that heterogeneous edge computing frameworks introduce greater potential complexity and security concerns.

Edge Computing Security Concerns

In fact, physical and virtual security can pose a significant challenge for organizations using IoT edge devices and edge computing networks. There are a number of potential problems.

One of the biggest risks is dealing with thousands or even hundreds of thousands of sensors and devices from different manufacturers that rely on different firmware, protocols and standards. Adding to the problem: many organizations struggle to track edge IoT devices and other assets. In some cases, organizations wind up with different business or IT groups setting up devices that operate independently of each other.

Edge computing devices present other security challenges. Since many of them lack an interface, it’s necessary to manage security settings using outside devices and programs. When an organization deploys a large number of these devices, the security challenges become magnified. As a result, it’s important to focus security on a number of factors and issues, including device firmware and operating systems, TCP/IP stacks, network design and data security tools, such as encryption at rest and encryption in motion and data tokenization.
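
Of the data security tools listed, tokenization is the easiest to illustrate: sensitive values are swapped for random, opaque tokens, and the real values live only in a protected vault. A minimal in-memory sketch follows; a production vault would be an encrypted, access-controlled store, not a Python dict:

```python
import secrets

class TokenVault:
    """Replace sensitive values with opaque tokens, keeping the
    token-to-value mapping in a protected store."""

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # stand-in for a secured store

    def tokenize(self, value: str) -> str:
        """Return a random token that can stand in for the value downstream."""
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value (authorized callers only)."""
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token)  # e.g. tok_9f2c... (random each run)
```

Because the token carries no information about the original value, an attacker who intercepts edge traffic gains nothing without also breaching the vault.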

Network segmentation is also critical. It’s wise to isolate key systems, components and controls—and have ways to shut down a node or system that has been attacked. By segmenting and air gapping groups of devices and systems, it’s possible to prevent a breach or failure at one point in the network from cascading into the failure of the entire edge computing platform.

In addition, it’s critical to use standard security tools and strategies such as auditing the network and devices, changing passwords, disabling unneeded features that may pose a risk, and retiring devices that are no longer needed.

Edge Computing Companies

Over the last few years, numerous vendors have entered the edge computing space. These vendors address different market niches. Some, such as Dell, Cisco, and HPE, sell networking and computing equipment that supports various aspects of edge and IoT frameworks, ranging from control systems to telecommunications.

Others, such as AWS, Microsoft Azure and Google Cloud, deliver cloud-based software and services that support IoT and edge functionality—including device management, machine learning and specialized analytics. Still others, such as PTC ThingWorx and Particle, deliver sophisticated platforms that connect and manage large numbers of edge IoT devices.

The Edge Continues Expanding

Organizations of all shapes and sizes benefit from a clear strategy for navigating edge computing and IoT devices. Over the next few years, the need to process data on the edge will grow. But it isn’t only the volume of data that’s important. It’s also the velocity of data moving within organizations and across business partners and supply chains.

As digital frameworks evolve and the need to compute within decentralized environments grows, edge infrastructure becomes indispensable.

Best IoT Platforms & Software https://www.datamation.com/networks/iot-platforms/ Tue, 23 Feb 2021 23:04:18 +0000

Over a few short years, the Internet of Things (IoT) has evolved from an intriguing concept with limited capabilities into a full-fledged platform for IT and business.

Organizations are increasingly turning to IoT platforms to perform an array of tasks, ranging from real-time inventory visibility and predictive maintenance to energy management and smart buildings. They’re also adopting Industry 4.0 tools like digital twins. It’s safe to say that no sector has been left untouched by the IoT – cloud computing in particular is closely linked to it.

IoT platforms play an important and growing role in connected systems. They introduce an architecture that tames some of the rough edges associated with connecting devices, standards, protocols and software systems. Instead of forcing organizations to build an IoT framework entirely from scratch, they tie together device management, data collection, data analytics, machine learning (ML), IT integration and cybersecurity. IoT and Industrial IoT (IIoT) platforms simplify tasks, trim costs and drive performance gains.

Of course, finding the right platform is essential. Pricing models, standards, cloud connectivity and elasticity, system flexibility and security methods vary greatly. Some platforms excel at connecting sensors while others are focused more on communications and data processing.

As a result, it’s important to consider what your organization’s requirements are for hardware, data access, reporting, and budgeting before selecting an IoT platform. Different business models and different IT infrastructures are better suited to one platform or another. 

How to Select the Best IoT Platform

There are several factors to consider when analyzing the IoT platform marketplace. These include:

What are we trying to achieve?

As with any information technology solution, it’s important to start with an assessment of how the IoT can automate and improve business practices and processes. This includes productivity gains, faster and better functionality and lower costs.

What technology do we already have in place?

It’s essential to analyze existing IT, cloud and network frameworks to determine their fit with an IoT platform. Along the way, an organization must determine whether current IoT devices will work with the framework and which, if any, require upgrading, retrofitting or outright replacement.

Is the platform flexible?

The IoT space is evolving at a furious rate. While all these vendors support some level of flexibility, not all approaches are equal and some are a better match for certain IoT configurations. There’s also a need to support a growing array of open source components. Matching your roadmap with theirs is essential.

What type of data analytics and machine learning (ML) does it support?

IoT frameworks are designed to automate data collection and processing on the edge. Machine learning is a key part of this picture. As a result, it’s important to understand whether an IoT platform supports ML.

What security is in place?

The IoT is notoriously weak in regard to cybersecurity. Device manufacturers rely on various standards and approaches, which often results in gaps and vulnerabilities. A platform may provide some help.

What’s the vendor’s strategy and roadmap?

It’s always wise to survey the vendor to determine how it approaches updates, patches, security issues and other factors. Similarly, it’s important to understand how it handles customer support and how it sees the platform evolving.

What’s the cost and the ROI?

It’s vital to consider the initial cost of an IoT platform but also total cost of ownership (TCO) and what type of return on investment it can deliver to your organization.

Here are ten leading IoT platforms:

Leading IoT Platforms

AWS IoT

AWS IoT is designed to auto-provision, manage and support connected systems from the edge to the cloud. It includes analytics and data management features, tools for integrating devices, and multi-layered security mechanisms such as authentication, encryption and access controls. The focus is on industrial, connected home and commercial applications. AWS IoT integrates with other AWS solutions and components as well as open source frameworks.

Pros

  • Highly scalable cloud infrastructure supports billions of devices and trillions of messages.
  • Highly specialized tools and device software streamline workflows and processes.
  • Offers pay-as-you-go pricing with templates and ready-built solutions for specific industries.
  • AWS has partnerships with top IoT industry vendors, including Ayla, Bosch, Domo, Deloitte, Kinesis, Wipro and Verizon.

Cons

  • Users report that it can be difficult to set up and use.
  • Some users complain that documentation and product support are at times lacking.
  • Debugging software and connections can be a problem.
  • IIoT capabilities are not as fully built out.

Ayla Agile IoT Platform

This cloud-based platform-as-a-service framework supports commercial and industrial solutions suited to a variety of vertical industries, including food services, appliances and manufacturing. Ayla Agile IoT Platform addresses edge connectivity, device management, data aggregation and processing, and enhanced security functions.

Pros

  • Partnerships in place with leading cloud and service providers, including AWS, Google Cloud, IBM and Qualcomm. A cloud connection agent simplifies connectivity and support.
  • Receives high marks for ease of use and the large number of devices it supports.
  • Users report an intuitive user interface (UI) and strong notification and reporting capabilities, including filters and drill-down views of devices.
  • Offers digital twins and advanced diagnostic functions.

Cons

  • Some complain that the platform lacks desired features and capabilities.
  • The focus is on three primary areas: home automation systems, discrete and process manufacturing, and telecoms/Internet Service Providers.
  • May require integration with more advanced solutions to deliver the full functionality required by a business.

Azure IoT

The cloud-hosted platform ties together numerous templates, tools and open source components to support IoT initiatives ranging from condition monitoring to predictive maintenance. Azure IoT Hub manages bidirectional communications to and from devices, including provisioning and authentication. The platform supports numerous industries and use cases, including process manufacturing, energy, healthcare, retail and transportation.

Pros

  • The platform supports hybrid IoT applications through Azure IoT Edge and Azure Stack.
  • Offers built-in device management and provisioning to connect and manage IoT devices at scale.
  • Supports digital twins and offers strong analytics and ML support.
  • Includes a security-enhanced communication channel for sending and receiving data from IoT devices.

Cons

  • Can be complex to set up and use. An extensive knowledge base covers the platform, but users report difficulty finding answers.
  • Some users complain that the platform lacks key operational functionality.

Cisco Kinetic

The IoT operations platform handles complex gateway management tasks, including provisioning and monitoring. It also tackles edge and fog processing of data and includes a data control module that facilitates the movement of data using policy enforcement mechanisms. Kinetic supports large IoT deployments and rules-based policy management across multi-cloud environments.

Pros

  • The platform is highly scalable and modular. It can connect a wide range of devices.
  • Provides deep visibility into nodes, microservices and other components—with a minimal footprint.
  • Strong real-time data visualization with access to various data sources, including IoT devices and databases.
  • Offers pre-built widgets and templates for data visualization and other tasks.

Cons

  • Lacks specialized IoT components that may be necessary for an IoT project.
  • Users report that setting up and using the platform can be challenging.
  • There are some limitations for non-Cisco networking and infrastructure hardware.

Google Cloud IoT

Google Cloud IoT delivers an intelligent platform for building and managing a highly scalable network of IoT devices. It’s designed to manage devices and data on the edge and into the cloud. The platform offers strong analytics, ML and automation features that are valuable for predictive maintenance, real-time asset tracking, logistics and supply chain management and smart city and building initiatives.

Pros

  • Offers a powerful AI platform, including more advanced Vision AI and Video AI that drive insights from images and video in the cloud and on the edge.
  • Offers robust IoT developer kits.
  • Extensive ecosystem of partners, including Accenture, NetApp, Palo Alto Networks, Siemens and Sigfox.

Cons

  • Users report challenges related to setting up, configuring and using the platform.
  • Security and privacy settings can be confusing, especially for APIs and authentication.
  • Limited support for using and importing datasets from outside the Google ecosystem. 

IBM Watson IoT

IBM’s fully managed, cloud-hosted IoT platform tackles everything from device registration and authentication to connectivity and data management and analytics. Areas of specialization include enterprise asset management, facilities management and systems engineering.

Pros

  • Highly scalable and flexible.
  • Supports powerful AI-driven analytics and ML functions that can be adapted to an industry or business.
  • Watson cognitive APIs support interconnectivity across devices and vendors.
  • The platform supports blockchain.

Cons

  • Some users report a steep learning curve.
  • Limited data storage formats and options can make global data management challenging. 

Oracle IoT Intelligent Applications

Oracle delivers broad and deep visibility into IoT devices. The cloud-based platform is optimized for smart manufacturing, connected assets, connected logistics, workplace safety and other tasks. It supports the use of real-time data for visualizations, mapping and automation.

Pros

  • Built-in integrations and API framework for ERP, supply chain management (SCM) and other enterprise systems and data.
  • Offers pre-built dashboards and widgets that facilitate deep visibility into data and events.
  • Supports digital twins and 3D visualizations.
  • Offers pre-built threads for enterprise applications such as manufacturing, maintenance, transportation and warehouse management.

Cons

  • Some functions and features are limited to using an Oracle infrastructure.
  • May require third-party device management solutions for specialized needs and requirements.
  • Oracle’s IoT Cloud Service doesn’t support a complete range of IIoT protocols and third-party IoT products. 

Particle

Particle offers an extensive cloud-to-edge framework for managing connected devices. It accommodates a broad array of tasks, including asset tracking, fleet management, predictive maintenance, environmental monitoring, real-time order fulfillment, and remote monitoring and control.

Pros

  • Global IoT connectivity through Wi-Fi, cellular and BLE in over 150 countries.
  • Strong security features, including built-in device encryption, PKI authentication, robust security logging and strong privacy controls.
  • Strong analytics and ML features.
  • Excellent scalability, including auto-provisioning and device scaling.
  • Large community of users and strong support capabilities.

Cons

  • Complex configurations can be prone to disruptions and interruptions.
  • High upfront costs, though these are tempered by reduced operations and development costs.
  • Users complain that the environment can be complex. 

PTC ThingWorx

The ThingWorx platform is a robust, fully developed industrial IoT (IIoT) solution. It addresses a wide range of manufacturing, service and engineering use cases through end-to-end device auto-provisioning and management. ThingWorx specializes in remote access monitoring, remote maintenance and service, predictive capabilities and other functions on-premises and in the cloud.

Pros

  • Has an extensive global ecosystem of technology partners and systems integrators.
  • Uses more than 150 drivers to boost standardization and connectivity across heterogeneous environments.
  • ThingWorx Flow offers powerful orchestration capabilities in a visual environment.
  • Highly rated service and support.

Cons

  • Users say the digital twin component can be difficult to integrate and use with some industrial applications.
  • Lacks some standardized tools for builds and deployments as well as code analysis and verification.
  • Some users complain about a lack of tools and widgets to manage IoT devices.

SAP Internet of Things

The platform provides cloud, edge and data technologies required to build out the IoT. It also aggregates IoT data to drive analytics, machine learning, and blockchain technologies through SAP Analytics Cloud. In addition, SAP offers various microservices that can be deployed across edge computing and IoT devices. These can be used for smart systems and supply chain optimization.

Pros

  • Strong IoT data management, analytics and ML capabilities. Supports data persistence, streaming analytics, predictive analytics, and contextual features.
  • Supports digital twins through sensor and contextual business data.
  • Strong automation capabilities through IoT application templates.
  • Highly scalable.
  • Top rated service and support.

Cons

  • Users report difficulties integrating the platform with legacy IT and non-SAP components.
  • May require significant customization in order to build out an IoT ecosystem.
  • Some users report difficulties finding features and navigating to desired locations within the platform.
  • Pricing model can be complex.

Best IoT Platform Comparison Chart

AWS IoT

Pros:
  • Excellent templates
  • Pay-as-you-go pricing
  • Broad ecosystem of partners

Cons:
  • Debugging can be difficult
  • IIoT capabilities are somewhat limited

Ayla Agile IoT Platform

Pros:
  • Strong partnerships
  • Users find it easy to use, with an intuitive interface
  • Strong digital twin support

Cons:
  • Users would like to see additional features
  • IoT focus is narrower than other vendors’
  • May require additional integration

Azure IoT

Pros:
  • Powerful device management
  • Strong digital twin support
  • Focus on security

Cons:
  • Some users complain about missing features and functionality
  • Can be expensive

Cisco Kinetic

Pros:
  • Scalable and modular platform
  • Delivers deep visibility into devices and microservices
  • Pre-built templates for visualizations and other tasks

Cons:
  • Implementing and using the platform can be difficult
  • May not support non-Cisco networking components
  • No IoT device hardware offered

Google Cloud IoT

Pros:
  • Best-in-class AI, including vision and video AI
  • Robust developer kits
  • Extensive partner network

Cons:
  • Security and privacy controls can be difficult and confusing
  • Limited support for data residing outside the Google ecosystem

IBM Watson IoT

Pros:
  • Strong support for AI-driven analytics and ML
  • Robust APIs support interconnectivity
  • Support for blockchain

Cons:
  • Some users complain about limited storage options
  • Expensive

Oracle IoT Intelligent Applications

Pros:
  • Built-in integrations and APIs
  • Excellent dashboards and widgets
  • Strong support for digital twins and 3D visualizations

Cons:
  • Some functions and features aren’t available outside an Oracle infrastructure
  • May require third-party device management add-ons
  • Doesn’t support a full range of IIoT protocols

Particle

Pros:
  • Strong security features
  • Highly scalable and flexible
  • Large user community

Cons:
  • Expensive
  • Steep learning curve for certain configurations

PTC ThingWorx

Pros:
  • Focus on standardization
  • Powerful orchestration tools
  • Top-notch service and support

Cons:
  • Lacks standardized tools for certain tasks
  • Lack of widgets
  • Expensive

SAP Internet of Things

Pros:
  • Strong digital twin support
  • Powerful automation
  • Highly scalable
  • Excellent service and support

Cons:
  • Pricing model can be complex
  • Users say navigation can be a challenge
  • May require heavy customization
Top APM Tools & Software https://www.datamation.com/applications/top-apm-tools/ Wed, 23 Dec 2020 06:00:00 +0000 http://datamation.com/2019/06/27/top-app-monitoring-software/

Given the speed and complexity of the cloud computing era, application performance monitoring (APM) is an increasingly important element of successful business. When critical applications like Big Data software run slowly, productivity drops, IT costs rise, and employees and customers can become frustrated.

Application performance monitoring targets these issues. It provides tools for managing code, understanding application dependencies, viewing transaction times and other technical indicators, and gauging overall user experience.

APM tools help a business know when it’s on track with overall objectives, and they also help developers understand whether they’re coding effectively. For example, a tool may track data analytics metrics in relation to the customer journey and overall experience. It may provide insight into performance issues related to servers, storage, software as a service or other factors. Or it might deliver code-level visibility into Java or .NET apps and spot problems.
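The transaction-timing side of this is easy to picture. The sketch below is a minimal, hand-rolled stand-in for what APM agents automate: a decorator that records the wall-clock duration of each named transaction. The names and structure are assumptions for illustration, not any vendor's API.

```python
import time
from collections import defaultdict

# transaction name -> list of observed durations in seconds
timings = defaultdict(list)

def traced(name):
    """Decorator that records the wall-clock duration of each call,
    the way an APM agent instruments a transaction."""
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                timings[name].append(time.perf_counter() - start)
        return inner
    return wrap

@traced("checkout")
def checkout(items):
    # Hypothetical business transaction being monitored.
    return sum(items)

checkout([19.99, 5.00])
```

A real agent does this via bytecode or auto-instrumentation rather than explicit decorators, and correlates the timings with traces, logs and user sessions, but the raw measurement it collects looks much like this.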

These tools typically fall into three basic categories: metrics-based, code-level performance and network-based. Many of the solutions below combine elements of each.

Although every organization can benefit from APM tools, finding the right vendor and specific solution can prove challenging. Different products approach application performance monitoring in different ways, including the type of infrastructure, level of automation, the use of machine learning, and the ability to integrate with cloud applications.

Choosing the Right Application Performance Monitoring Tool: Three Tips

Understand your APM needs

Organizations should consider these functional areas when selecting a solution:

  • Digital experience monitoring (DEM), which helps optimize performance for a digital agent, human or machine, particularly as an entity connects to enterprise applications and services;
  • Application discovery, tracing and diagnostics (ADTD), which diagnoses processes and examines relationships between application servers, nodes and other systems;
  • Artificial intelligence for IT operations (AIOps), which combines big data and machine learning functionality to support IT operations.

Select the right framework

There are a number of key considerations in selecting an application. These include cloud integration, IoT integration, database support, dashboard visibility and controls, reporting (including historical analytics), code language support, the ability to conduct end-to-end tracing, cross-application tracking capabilities, code-level diagnostics and tracing, and notification and alert capabilities. Together these elements comprise the framework – and the framework must check enough of your boxes.

Choose the vendor that fits your particular needs

Due diligence is vitally important when selecting a vendor. It’s wise to thoroughly question an APM solutions provider and understand its business philosophy, technology framework and vision for the future. Other important considerations are licensing costs, update policies, service level agreement (SLA) and overall customer support. But most important: are they offering some level of flexibility for your specific business needs? Are they willing to negotiate?

In this Datamation article, we have identified 10 top application performance monitoring vendors and tools:

Broadcom (CA Technologies)

Value proposition for potential buyers: Broadcom purchased CA Technologies in late 2018. The platform—available both on premises and as a SaaS solution—delivers core APM, infrastructure, network, end-user, cloud, mainframe and business transaction monitoring within the vendor’s CA Digital Experience Insights (DXI) platform. The focus is heavily on actionable analytics that identify problem points and promote improved digital experiences. Broadcom is ranked as a “Leader” in Gartner’s MQ.

Key values/differentiators:

  • The product offers a comprehensive approach to APM. Over the last few years, the vendor has focused on modernizing its underlying technology architecture, including through a greater use of open source tools. One of the main areas of focus is visual analytics.
  • The vendor also has focused on improving the usability of its solutions through improved assisted triage workflow, Gartner noted. The system is designed to aid in identifying performance anomalies and business transactions that can be further investigated through detailed drill-downs.
  • CA aims to extend its coverage of applications and IT infrastructures. It is continuing to expand and extend core functionality to ingest a broader array of data sources through built-in functionality as well as through open source components.

Cisco (AppDynamics)

Value proposition for potential buyers: Cisco acquired AppDynamics in 2017. The APM solution is available as both an on-premises and SaaS solution. The platform offers core APM monitoring but also analytics tools for tracking end users and various types of infrastructure, including mainframes, cloud, and SAP S/4HANA. Gartner ranks Cisco (AppDynamics) as a “Leader” on its MQ.

Key values/differentiators:

  • Cisco has added machine learning technology to the AppDynamics platform. The vendor also has added features to broaden and deepen support for the cloud. This includes microservices, serverless computing, container and hybrid environments, and cloud-native integrations.
  • The platform offers comprehensive support for tracking business processes and metrics that can aid in supporting IT as well as lines of business and the overall enterprise.
  • Cisco has established a roadmap focused on enhancing its solutions through added cloud capabilities, improved business performance monitoring, an improved UI, and greater support for commercial off-the-shelf (COTS) applications.

Dell Foglight (Quest)

Value proposition for potential buyers: The website monitoring platform focuses on risk assessment, diagnostics, user management and server monitoring for online environments. Quest is designed to simplify IT management by encompassing a framework of data protection, database management, security, performance monitoring and more.

Key values/differentiators:

  • The Quest platform supports nearly every approach and environment, including Active Directory, Azure Active Directory, Exchange, Google, Hadoop, Office 365, Oracle, SharePoint, SQL Server and VMware.
  • The solution supports database monitoring and performance optimization. It includes advanced workload analytics tools that help organizations consolidate and standardize database performance management across diverse multi-platform environments.
  • Foglight offers a “single pane of glass” into heterogeneous virtual environments. This simplifies application monitoring and performance management for various tasks, including asset tracking, changes in machines and VM migrations. The program is particularly adept at displaying information about storage utilization, memory usage, CPU performance, disk inputs/outputs (I/O), and network I/O.

Dynatrace

Value proposition for potential buyers: The privately held firm offers an APM solution that is available on-premises, on a managed services basis or as a SaaS offering. It includes APM, DEM, infrastructure, network monitoring and AIOps capabilities—along with real-time topology and AI algorithms that automatically detect anomalies and understand the business impact across users, applications, and infrastructure. Dynatrace is ranked as a “Leader” in Gartner’s MQ.

Key values/differentiators:

  • The vendor supports a wide array of environments, from mainframe to COTS and SaaS through a combination of legacy and newer technology infrastructure and tools. The firm’s APM approach is supported by its OneAgent architecture, which focuses on total automation and zero configuration.
  • Dynatrace has continually expanded the breadth and depth of the APM solution, including greater support for cloud frameworks. In late 2017, the vendor acquired Qumram, which added session replay to its DEM module.
  • Gartner noted that the vendor’s roadmap includes expanded support for multicloud and hybrid architectures. This will support the greater use of purpose-built AI for faster root cause analysis and improved automation and remediation. The vendor is also expanding the use of session replays as part of customer and business journey analysis.

IBM

Value proposition for potential buyers: IBM offers both on-premises and SaaS-based APM solutions. Each uses an approach specifically optimized for the user’s application environment. IBM’s large network of business partners, and the product’s ability to connect with a wide range of products and solutions, makes it a popular choice for larger and mid-size organizations. Gartner ranks IBM as a “Challenger” on its MQ.

Key values/differentiators:

  • The SaaS package is a multi-tenant APM solution that is part of IBM Cloud App Management Base and Advanced. It includes a web-based UI and configurable dashboard that monitors AWS, Azure and other cloud environments using cloud-native APIs.
  • IBM includes RUM synthetic transactions monitoring, log analytics, middleware monitoring, multivariate anomaly detection, and business insight through IBM Business Monitor.
  • IBM is a leader in AI and cognitive computing. It leverages the Watson AI platform within its APM solution. It also incorporates powerful open source tools, such as the Grafana plugin, for advanced monitoring, analytics, and visualizations.

Microsoft

Value proposition for potential buyers: Microsoft delivers full APM support only as a SaaS solution, though the vendor’s older System Center Operations Manager can tackle basic functions. The solution is designed to work with Azure, and it supports .NET and Java applications, along with apps written in Python, Go and Node.js. Microsoft is ranked as a “Challenger” on Gartner’s MQ.

Key values/differentiators:

  • Microsoft integrated its Application Insights tool into Azure Log Analytics to form the new Azure Monitor in 2018. This SaaS-based multitenant APM solution is deeply integrated with Microsoft Azure. It includes DEM, log analytics and cloud-native monitoring for Azure, with support for containers and Kubernetes.
  • The APM solution offers strong integration with Microsoft development tools. It also offers analytics tools that are designed to accommodate large volumes of events, logs, metrics, transactions and security.
  • Microsoft offers a consumption-based pricing model that Gartner describes as a “competitive differentiator.”
  • The roadmap for Microsoft APM revolves around adding algorithms to reduce event noise, improve forecasting performance and detect anomalies. The vendor also plans to expand support to additional geographic regions.

New Relic

Value proposition for potential buyers: The vendor offers APM only as a SaaS solution. It is designed to work with cloud platforms such as AWS, Azure and Google Cloud. It supports Kubernetes containers and microservices monitoring, as well as business-centric analytics, infrastructure monitoring and distributed tracing capabilities. Gartner ranked New Relic as a “Leader” in its MQ.

Key values/differentiators:

  • The vendor is known for delivering a robust UI and strong workflow capabilities. It is among the easiest and quickest APM solutions to deploy and deliver results. It includes a powerful auto-instrumentation feature that supports major programming languages.
  • Several acquisitions have strengthened the firm’s position in recent months. In addition, New Relic invests heavily in R&D and plans to continue upgrading the platform. This includes stronger root cause analysis features, faster and better anomaly detection and event correlation, and faster incident response capabilities.
  • A February 2019 acquisition of analytics vendor SignifAI added machine learning and AI features to the platform.

Oracle

Value proposition for potential buyers: Oracle is a long-time provider of APM tools. It offers an on-premises solution through Oracle Enterprise Manager (OEM) and a SaaS solution through Oracle Management Cloud (OMC).  The latter platform is a multitenant framework that addresses APM requirements across applications, infrastructure and end-user monitoring environments. Oracle is ranked as a “Challenger” in the Gartner MQ.

Key values/differentiators:

  • Although OMC is optimized for Oracle infrastructure and workloads, it can be used for APM within heterogeneous environments. The solution is able to collect log and metrics data from numerous external sources.
  • The OMC APM solution is available in different configurations that are designed to address different customer needs. This includes analytics requirements and the level of orchestration required to oversee infrastructure.
  • Oracle is expanding the platform to include support for additional modern programming languages through OpenTracing. It is incorporating more robust features and capabilities through proprietary and open source tools.

Riverbed

Value proposition for potential buyers: Riverbed is a long-time provider of APM solutions. It offers several products that address different enterprise requirements. These include both on-premises and SaaS-based solutions for monitoring, analyzing and addressing anomalies and various other challenges. Gartner ranks Riverbed as a “Challenger” in its MQ.

Key values/differentiators:

  • AppInternals is Riverbed’s core APM solution. It offers agent-based, bytecode instrumentation and integrates with several of the vendor’s other products to deliver more comprehensive infrastructure monitoring and application and user management.
  • Gartner gives Riverbed high marks for delivering a consistent user experience across both on-premises and SaaS products. It also praised the company for its use of closely coupled DEM and NPMD functions, and for its ability to handle large volumes of data effectively.
  • Riverbed is working to better unify agents and features within its various APM tools and products, along with microservices and containers. In addition, it is adding support for modern languages.

SolarWinds

Value proposition for potential buyers: The vendor entered the APM market in 2016, after acquiring the assets of AppNeta. It offers powerful tools that span IT networks, infrastructures and applications. These include host agents, SNMP polling and application dependency mapping. The vendor’s solutions are available in both on-premises and SaaS-based versions. The on-premises solution is called Server & Application Monitor (SAM) and the SaaS product is named AppOptics. Gartner ranked the company a “Niche Player” in its MQ.

Key values/differentiators

  • The vendor has added numerous features and capabilities to its AppOptics solution over the last couple of years, including support for containers such as Docker and Kubernetes. It supports code-level instrumentation and infrastructure monitoring.
  • SolarWinds is bolstering analytics capabilities and adding machine learning tools that are designed to automate processes and reduce complexity. This includes combining tracing, metrics, logging and end-user data into a unified workflow, and adding time-series prediction and classification features.
  • The vendor’s tools are designed primarily for small and mid-size organizations. They are ideal for businesses looking to deploy a solution based on a simpler consumption model. SolarWinds is known for solutions that are intuitive and straightforward.

Top Application Performance Management Comparison Chart

Broadcom (CA Technologies)

Focus: On-premises and SaaS APM that revolves around actionable analytics to improve digital experiences.
Key differentiator: Offers a Digital Experience Insights (DXI) platform that addresses infrastructure, network, cloud, end-user and business transaction monitoring.
Key features: Analytics tools; assisted triage workflow that offers drill-down visibility into issues.

Cisco (AppDynamics)

Focus: Offers on-premises and SaaS APM with monitoring, diagnostics and analytics.
Key differentiator: Addresses a wide array of APM requirements, including mainframes, clouds, and SAP S/4HANA.
Key features: Supports business and IT metrics; integrated machine learning and powerful cloud tools.

Dell Foglight (Quest)

Focus: A SaaS-based APM solution that provides risk assessment, diagnostics, user management and server monitoring for online environments.
Key differentiator: The Quest platform provides tools that consolidate and standardize database performance management across diverse multi-platform environments.
Key features: Database monitoring; performance optimization; advanced workload analytics.

Dynatrace

Focus: APM available on-premises, as a managed service or as a SaaS solution.
Key differentiator: OneAgent architecture supports a wide array of environments, from mainframe to COTS and SaaS, through a combination of legacy and newer technology infrastructure and tools.
Key features: Offers real-time topology and AI algorithms that automatically detect anomalies and understand the business impact across users, applications and infrastructure.

IBM

Focus: SaaS-based multi-tenant APM solution.
Key differentiator: Includes a web-based UI and configurable dashboard that monitors AWS, Azure and other cloud environments using cloud-native APIs. Offers strong AI capabilities.
Key features: RUM synthetic transactions monitoring, log analytics, middleware monitoring, multivariate anomaly detection and business insight.

Microsoft

Focus: Full APM support available through a SaaS solution; basic APM functionality through System Center Operations Manager.
Key differentiator: The solution works with Azure and supports .NET and Java, along with apps written in Python, Go and Node.js.
Key features: Includes DEM, log analytics and cloud-native monitoring for Azure, with support for containers and Kubernetes.

New Relic

Focus: SaaS-based APM.
Key differentiator: Auto-instrumentation framework supports cloud platforms such as AWS, Azure and Google Cloud, with an excellent UI and strong workflow features.
Key features: Dashboard provides deep visibility into applications and performance; includes machine learning and AI features.

Oracle

Focus: Offers on-premises and SaaS-based APM through different applications.
Key differentiator: Oracle Management Cloud handles Oracle workloads as well as heterogeneous frameworks. The solution is available in different versions for different purposes.
Key features: Strong analytics and orchestration features; growing support for modern programming languages and open source tools.

Riverbed

Focus: Offers on-premises and SaaS APM.
Key differentiator: Riverbed’s core APM solution, AppInternals, delivers agent-based, bytecode instrumentation and deep integration with the vendor’s other products and tools.
Key features: Offers coupled DEM and NPMD functions; supports large volumes of data through comprehensive infrastructure monitoring and application and user management.

SolarWinds

Focus: Offers on-premises and SaaS solutions.
Key differentiator: Offers powerful tools that span IT networks, infrastructures and applications, including host agents, SNMP polling and application dependency mapping.
Key features: The SaaS solution, AppOptics, supports Docker and Kubernetes, along with code-level instrumentation and infrastructure monitoring.
Top 10 Hyperconverged Infrastructure (HCI) Solutions https://www.datamation.com/data-center/top-10-hyperconverged-infrastructure-hci-solutions/ Tue, 22 Dec 2020 14:48:41 +0000 https://datamation.com/?p=20403

A hyperconverged infrastructure (HCI) solution is a primary tool for connecting, managing and operating interconnected enterprise systems. The technology helps organizations virtualize storage, servers and networks. While converged infrastructure uses hardware to achieve this objective, HCI takes a software-centric approach.

To be sure, hyperconvergence has its pros and cons. Yet the advantages are clear: HCI boosts flexibility by making it easier to scale according to usage demands and adjust resources faster and more dynamically. By virtualizing components it’s possible to build more efficient databases, storage systems, server frameworks and more. HCI solutions increasingly extend from the data center to the edge. Many also incorporate artificial intelligence and machine learning to continually improve, adapt and adjust to fast-changing business conditions. Some also contain self-healing functions.

By virtualizing an IT environment, an enterprise can also simplify systems management and trim costs. This can lead to a lower total cost of ownership (TCO). Typically, HCI environments use a hypervisor, usually running on a server that uses direct-attached storage (DAS), to create a data center pool of systems and resources. Most support heterogeneous hardware and software systems. The end result is a more flexible, agile and scalable computing framework that makes it simpler to build and manage private, public and hybrid clouds.

How to Select the Right HCI Solution

A number of factors are important when evaluating HCI solutions. These include:

Edge-core cloud integration. Organizations have vastly different needs when it comes to connecting existing infrastructure, clouds and edge services. For instance, an organization may require only the storage layer in the cloud. Or it may want to duplicate or convert configurations when changing cloud providers. Ideally, an HCI solution allows an enterprise to change, upgrade and adjust as infrastructure needs change.

Analytics. It’s crucial to understand operations within an HCI environment. A solution should provide visibility through a centralized dashboard but also offer ways to drill down into data, and obtain reports on what is taking place. This also helps with understanding trends and doing capacity planning.

Storage management. An HCI solution should provide support for setting up and configuring a diverse array of storage frameworks, managing them and adapting them as circumstances and conditions change. It should make it simple to add nodes to a cluster and should support block, file and object storage. Some systems also offer NVMe-oF (non-volatile memory express over fabrics) support, which allows an enterprise to rearchitect storage layers using flash memory.
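To see why easy node addition matters, consider a toy model of a pooled storage cluster in which data placement is proportional to each node's share of total capacity: adding a node simply grows the pool and shifts every node's share. This is an illustration only, not any vendor's actual placement algorithm.

```python
class StoragePool:
    """Toy model of an HCI storage pool: each node contributes capacity,
    and data placement is proportional to each node's share of the pool."""

    def __init__(self):
        self.nodes = {}  # node name -> capacity in TB

    def add_node(self, name: str, capacity_tb: float) -> None:
        """Scaling out is just registering more capacity in the pool."""
        self.nodes[name] = capacity_tb

    def total_capacity(self) -> float:
        return sum(self.nodes.values())

    def share(self, name: str) -> float:
        """Fraction of pool data this node holds under proportional placement."""
        return self.nodes[name] / self.total_capacity()

pool = StoragePool()
pool.add_node("node-a", 10)
pool.add_node("node-b", 10)
pool.add_node("node-c", 20)  # adding a node rebalances every node's share
```

Real HCI platforms layer replication, tiering and rebalancing logic on top of this idea, but the core appeal is the same: capacity and performance grow incrementally, one node at a time.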

Hypervisor ease of use. Most solutions support multiple hypervisors. This increases flexibility and configuration options—and it’s often essential in large organizations that rely on multiple cloud providers. But it’s important to understand whether you’re actually going to use this feature and what you plan to do with it. In many cases, ease of use and manageability are more important than the ability to use multiple hypervisors.

Data protection integration. It’s important to plug in systems and services to protect data—and apply policy changes across the organization. It’s necessary to understand whether this protection is scalable and adaptable, as conditions change. Ideally, the HCI environment can replace disparate backup and data recovery systems. This greatly improves manageability and reduces costs.

Container support. A growing number of vendors support containers, or plan to do so soon. Not every organization requires this feature, but it’s important to consider whether your organization may move in this direction.

Serverless support. Vendors are introducing serverless solutions that support code-triggered events. This has traditionally occurred in the cloud but it’s increasingly an on-premises function that can operate within an HCI framework.
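The code-triggered model behind serverless can be sketched generically: functions register for named events and run only when a matching event fires. A minimal, vendor-neutral illustration in Python (the EventRouter class and event names are invented for this sketch, not any particular product's API):

```python
# Minimal sketch of code-triggered event handling, as used in
# serverless frameworks: functions register for named events and
# run only when a matching event fires.
from collections import defaultdict

class EventRouter:
    def __init__(self):
        self._handlers = defaultdict(list)

    def on(self, event_name):
        """Decorator that registers a function for an event type."""
        def register(func):
            self._handlers[event_name].append(func)
            return func
        return register

    def fire(self, event_name, payload):
        """Invoke every handler registered for this event."""
        return [handler(payload) for handler in self._handlers[event_name]]

router = EventRouter()

@router.on("object.created")
def index_object(payload):
    return f"indexed {payload['key']}"

results = router.fire("object.created", {"key": "reports/q3.pdf"})
print(results)  # ['indexed reports/q3.pdf']
```

In a real HCI or cloud platform, the "fire" side would be driven by infrastructure events (a storage write, a queue message) rather than a direct call.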

Here are ten leading HCI solutions:


Leading Hyperconverged Infrastructure Solutions

Cisco HyperFlex HX-Series

The Cisco HyperFlex HX data platform manages business and IT requirements across a network. The solution accommodates enterprise applications, big data, deep learning and other components that extend from the data center to remote offices and out to retail sites and IoT devices. The platform is designed to work on any system or any cloud.

Pros

  • The platform includes hybrid, all-flash, all-NVMe, and edge configurations to deliver maximum flexibility and a high level of security, including self-encrypting options.
  • It relies on an integrated network fabric, and powerful data optimization features to deliver hyperconvergence to a wide range of workloads and use cases.
  • HyperFlex HX is highly scalable.
  • The technology supports deep learning on GPU-only nodes.

Cons

  • Requires an integrated Cisco network.
  • Some users find the pricing model confusing and somewhat high.
  • Limitations with analytics.
  • Systems configurations and manageability can present challenges.

 DataCore Software-Defined Storage

DataCore SDS delivers a highly flexible approach to HCI. It offers a suite of storage solutions that accommodate mixed protocols, hardware vendors and more within converged and hyperconverged SAN environments. The software-defined storage framework, SANsymphony, features block-based storage virtualization. It is designed for high availability. The vendor focuses heavily on healthcare, education, government and cloud service providers.

Pros

  • Supports mixed SAN, flash and disk environments.
  • Handles load balancing and policy management across heterogeneous systems.
  • Offers pool capacity and centralized control of primary and secondary storage.
  • Strong failover capabilities.

Cons

  • Some find the user interface daunting.
  • Licensing can be somewhat complex, though the vendor has introduced capacity-based licenses.
  • Some users report difficulties obtaining adequate customer support.

 Dell/EMC VxRail

VxRail delivers a fully integrated, preconfigured, and pre-tested VMware hyper-converged infrastructure appliance. It delivers virtualization, compute and storage within a single appliance. The HCI platform takes an end-to-end automated lifecycle management approach.

Pros

  • Delivers a single point of support by default for all software and hardware.
  • Cloud based multi-cluster management and intelligent upgrade staging.
  • Strong Kubernetes support.
  • Offers a lockstep 30-day synchronous release with VMware vSphere.
  • Users report low total cost of ownership.

Cons

  • Limited support for mixing older flash clusters and hyper-clusters.
  • Users report some manageability challenges, such as setting up naming schemas.
  • Can be somewhat pricey, depending on the IT environment and use case.

 HPE SimpliVity

HP Enterprise aims to take hyperconverged architectures beyond the realm of software-defined and into the world of AI-driven with SimpliVity. The HCI platform delivers a self-managing, self-optimizing, and self-healing infrastructure that uses machine learning to continually improve. HP offers solutions specifically designed for data center consolidation, multi-GPU image processing, high-capacity mixed workloads and edge environments.

Pros

  • Offers strong storage management, backup and data replication capabilities.
  • Offers a single well-designed interface for the entire solution.
  • Strong partner relationships, including SAP, Microsoft, Citrix, VMware and Docker.
  • Highly scalable and flexible without a penalty for availability.

Cons

  • Some users encounter difficulties moving SimpliVity clusters within the platform.
  • Can be pricey, depending on the use case.
  • Some users complain about the lack of customer and technical support.

 NetApp HCI

NetApp HCI consolidates mixed workloads while delivering predictable performance and granular control at the virtual machine level. The solution scales compute and storage resources independently. It is available in different compute and storage configurations, thus making it flexible and scalable across data center, cloud and web infrastructures.

Pros

  • Delivers strong manageability, granular controls and a high level of flexibility for HCI within a single pane of glass.
  • Automates numerous functions with a strong API framework and ecosystem.
  • Handles numerous types of workloads, including VMware, SQL, Oracle, SAP, Citrix and Splunk.
  • Highly scalable.

Cons

  • Installation and initial cabling can be challenging.
  • Users complain that documentation is sometimes lacking.
  • Some users complain about inadequate security controls and the lack of integration with other security solutions.

 Nutanix AOS

Nutanix offers a fully software-defined hyperconverged infrastructure that provides a single cloud platform for tying together hybrid and multi-cloud environments. Its Xtreme Computing platform natively supports compute, storage, virtualization and networking—including IoT—with the ability to run any app at scale. It also supports analytics and machine learning.

Pros

  • Offers a feature-rich platform that can be applied at scale. The platform is especially adept at handling data compression and deduplication.
  • Strong and easy-to-use management capabilities through a single user interface.
  • Provides automated application management in a full-cloud stack.
  • Users report excellent technical support.

Cons

  • Among the more expensive solutions on the market.
  • Users report some problems with complexity and using networking functions, including encryption and micro-segmentation.
  • Users report some difficulties integrating older legacy systems with the HCI environment.

 StarWind HyperConverged Appliance

StarWind offers an HCI appliance focused on both operational simplicity and performance. It bills its all-flash system as turnkey with ultra-high resiliency. The solution, designed for SMBs, ROBO sites and enterprises, aims to trim virtualization costs through a highly streamlined and flexible approach. It connects commodity servers, disks and flash; a hypervisor of choice; and associated software within a single manageable layer.

Pros

  • The appliance is highly scalable. It supports numerous disks and flash components, and easily scales by adding extra nodes.
  • It offers attractive pricing and low TCO.
  • The vendor’s ProActive support framework spots abnormalities and anomalies through persistent monitoring and machine learning.

Cons

  • The vendor’s Linux interface isn’t as developed and mature as the Windows interface it offers.
  • Some complaints from users about the interface and manageability functions.
  • Users say documentation could be more complete.

StarWind Virtual SAN

StarWind Virtual SAN is essentially a software version of the vendor's HyperConverged Appliance. It eliminates the need for physically shared storage by “mirroring” internal hard disks and flash between hypervisor servers. The approach is designed to cut costs for SMBs, ROBO sites, and cloud and hosting providers. Like the vendor's appliance, StarWind Virtual SAN is a turnkey solution.

Pros

  • Offers a powerful control panel with insight into the status and health of the VSAN.
  • Uses data locality and server-side caching to deliver high performance and fault tolerance.
  • Delivers low overhead and maintenance costs.
  • Users praise the vendor's ProActive support, which spots abnormalities and anomalies through monitoring and machine learning.

Cons

  • Some users complain that the licensing framework can be difficult and somewhat restrictive.
  • Lacks some features required for larger enterprises with more complex configurations.
  • PowerShell documentation presents challenges for some users.

 VMware vCenter Server

The vCenter Server delivers centralized visibility as well as robust management functionality at scale. The HCI solution is designed to manage complex IT environments that require a high level of extensibility and scalability. It includes native backup and restore functions. vCenter supports plug-ins for major vendors and solutions, including Dell EMC, IBM and Huawei Technologies.

Pros

  • vCenter can manage up to 70,000 virtual machines and 5,000 hosts across up to 15 vCenter Server instances.
  • Offers templates and RESTful APIs to automate setup and simplify deployments.
  • Includes machine learning capabilities.
  • Users praise VMware for streamlined setup, ease of use and performance.

Cons

  • Some users find the user interface confusing and difficult.
  • A faster HTML5 interface lacks key functionality found in the vendor’s Flex interface.
  • Kubernetes functionality only works in the cloud.
  • The solution can be pricey. Licensing is typically suitable only for medium and large enterprises.

 VMware vSAN

vSAN is an enterprise-class storage virtualization solution that manages storage on a single software-based platform. When combined with VMware vSphere, an organization can manage compute and storage within a single platform. The solution connects to a broad ecosystem of cloud providers, including AWS, Azure, Google Cloud, IBM Cloud, Oracle Cloud and Alibaba Cloud.

Pros

  • Offers powerful features, scales well and delivers excellent flexibility.
  • Excellent user interface.
  • Integrates seamlessly with VMware products but also with numerous partners.
  • vSAN manages all storage functionality. It eliminates the need for additional storage support.

Cons

  • Users cite occasional problems with failure protection and rebalancing components.
  • Upgrades and changes can present challenges.
  • Expensive relative to other solutions on the market. In many cases, the platform requires licenses for multiple VMware components in order to operate.

Hyperconverged Infrastructure Solutions Comparison Table

Cisco HyperFlex HX-Series
Pros: Supports numerous configurations and use cases; highly scalable; supports GPU-based deep learning.
Cons: Requires Cisco networking equipment; pricing model can be confusing; some users find manageability difficult.

DataCore Software-Defined Storage
Pros: Supports mixed SAN, flash and disk environments; excels with load balancing and policy management; strong failover capabilities.
Cons: User interface can be daunting; licensing can become complex; customer support is inconsistent.

Dell/EMC VxRail
Pros: Delivers a true single point of management and support; handles multi-cloud clusters well; integrates well with storage devices; low TCO.
Cons: Limited support for mixing older flash clusters and hyper-clusters; some management challenges; sometimes pricey.

HPE SimpliVity
Pros: Strong storage management, backup and data replication capabilities; users like the interface; strong partner relationships; highly scalable.
Cons: Managing clusters can present challenges; pricey; users cite problems with technical and customer support.

NetApp HCI
Pros: Excellent manageability with granular controls; strong API framework; support for numerous workloads from different vendors; highly scalable.
Cons: Installation and initial cabling can be difficult; documentation sometimes lacking; users say some security features and controls are missing.

Nutanix AOS
Pros: Feature-rich platform; single user interface with strong management tools; users report excellent tech support.
Cons: Pricey; users report some complexity with using encryption and micro-segmentation; can be difficult to integrate with legacy systems.

StarWind HyperConverged Appliance
Pros: Highly scalable; supports numerous configurations and technologies; users report low TCO; strong vendor support through always-on monitoring and machine learning.
Cons: Linux interface isn't as mature as the Windows interface; some find the interface difficult; users say documentation is sometimes lacking.

StarWind Virtual SAN
Pros: Excellent control panel; high fault tolerance; low overhead and maintenance costs; strong vendor support through always-on monitoring and machine learning.
Cons: Licensing framework can be difficult and restrictive; lacks some features important for large enterprises; PowerShell documentation can be challenging.

VMware vCenter Server
Pros: High capacity; strong APIs; machine learning features; high performance.
Cons: Interface can present challenges; Kubernetes works only in the cloud; pricey.

VMware vSAN
Pros: Powerful features; highly scalable and flexible; integrates with numerous partners; consolidates storage support.
Cons: Users cite problems with failure protection and rebalancing; upgrades may present problems; pricey, and requires multiple VMware licenses for various needed modules.

Best Stream Analytics Software https://www.datamation.com/big-data/best-stream-analytics-software/ Thu, 17 Dec 2020 14:54:38 +0000 https://datamation.com/?p=20406

Stream analytics software analyzes current and historical data as it travels across networks, into and out of databases and through application programming interfaces (APIs).

As a key component of data analytics, this ability to monitor and understand data in real time is at the center of today's digital enterprise. But achieving success is a growing challenge, particularly as data volumes swell. Crucial to the success of any big data project, stream analytics monitors events and information exchanges in real time, providing alerts and notifications when specified conditions occur.

As a result, stream analytics software is useful for a wide array of enterprise data tasks. These include geospatial analysis, understanding social media streams, tying together telemetry data from IoT devices, predictive analytics, spotting fraud, real-time point-of-sale and inventory analysis, and remote monitoring and maintenance tasks.

Some tools offer visualization features that allow users to view complex relationships among systems, connected devices and various types of data. Many rely on widely used frameworks such as Apache Kafka, SQL and JavaScript. The common theme for all stream analytics systems is that their data processing engines are designed to handle enormous volumes of data streaming from multiple sources simultaneously. Stream analytics is particularly powerful when it operates in the cloud.
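The windowed processing these engines perform can be illustrated with a toy tumbling-window aggregator. Real engines distribute this work across clusters and handle out-of-order events, but the core bucketing logic looks roughly like this (the function name and event shape are illustrative):

```python
# Minimal sketch of a tumbling-window aggregation, the core operation
# most stream analytics engines perform: events are bucketed into
# fixed-size time windows and aggregated per bucket.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """events: iterable of (timestamp, key); returns counts per (window_start, key)."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "login"), (12, "login"), (61, "login"), (65, "error")]
print(tumbling_window_counts(events, 60))
# {(0, 'login'): 2, (60, 'login'): 1, (60, 'error'): 1}
```

A production engine would evaluate alert conditions against these per-window aggregates as each window closes, rather than after the fact.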

How to Select the Best Stream Analytics Software For Your Company

There are several crucial factors to consider when selecting a stream analytics platform. These include:

Compatibility: It’s critical to survey enterprise data sources, map out connection points for applications and systems, and thoroughly understand which data streams are important—and for what purposes. Building an end-to-end pipeline requires support for coding languages, database formats and more.

Features: An organization should ensure that the package offers the right set of features, and they are robust and flexible enough to provide optimal results. Key features frequently include visualization dashboards, rich reporting capabilities, integrated development tools, data preparation and enrichment capabilities, and automation through machine learning.

Performance and reliability: Not only must a stream analytics software package operate with ultra-low latency, it must also provide the flexibility and scalability to add, subtract and change inputs and connection points—including message brokers and outside processing engines. Some packages also have built-in recovery capabilities.

Cost: It’s wise to view stream analytics in the context of total cost of ownership. Some packages now operate on a utility model—you pay for what you use and the streaming units you consume. Others use a more conventional licensing approach.

Security and compliance: Look for a package that incorporates incoming and outgoing encryption, as well as processing in memory so that data isn’t stored with a cloud provider. Equally important: ensure that a package adheres to all regulatory and compliance standards. Look for compliance certifications.

Top Stream Analytics Software

Here are ten leading stream analytics solutions to consider:


Amazon Elasticsearch Service

The managed service delivers a straightforward way to deploy, operate and scale Elasticsearch clusters in the AWS cloud. It provides direct access to the Elasticsearch APIs so that existing code and applications work seamlessly with the service. The platform offers an open-source search and analytics engine that focuses on use cases such as log analytics, real-time application monitoring, and clickstream analysis.
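Because the service exposes the standard Elasticsearch APIs, a log-analytics search is expressed as a JSON query DSL body. A hedged sketch of that shape in Python (the index pattern and field names such as `level` and `@timestamp` are assumptions for illustration, not from the article):

```python
# Sketch of an Elasticsearch-style query DSL body for log analytics:
# match ERROR-level log entries from the last hour. Field and index
# names are hypothetical placeholders.
import json

query = {
    "query": {
        "bool": {
            "must": [{"match": {"level": "ERROR"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
        }
    },
    "size": 20,  # return at most 20 matching documents
}

# With an Elasticsearch client this body would be sent as, e.g.:
#   client.search(index="app-logs-*", body=query)
print(json.dumps(query, indent=2)[:40])
```

The same query body works whether it is sent via the AWS CLI, an SDK, or a raw HTTPS request to the domain endpoint.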

Pros

  • Users can set up and configure a domain in minutes. It supports programmatic access through AWS CLI or the AWS SDKs.
  • The platform offers a high level of scalability, including support for numerous CPU, memory and storage configurations.
  • Offers up to 3 PB of attached storage.
  • Provides strong security, including identity and access controls; encryption of data at rest and in motion; index-level, document level and field-level security; and audit logs.

Cons

  • The platform may present a formidable learning curve.
  • Search queries and indexing can be difficult. If these processes aren’t set up correctly they can impact performance and results.
  • Some users find the interface daunting and have trouble customizing the service to the extent they desire.

Amazon Kinesis

The platform collects and processes large data streams in real time. Users can create applications that read data as records. These applications use the Kinesis Client Library and can run on Amazon EC2 instances. Kinesis supports dashboards, dynamic alerts, dynamic pricing and advertising strategies, along with many other functions. It supports data management across other AWS services.
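Kinesis's documented routing takes the MD5 hash of each record's partition key to place the record on a shard, which keeps records with the same key in order on the same shard. A simplified sketch of that idea (the `shard_for` helper is illustrative, not the AWS SDK, and real shards own explicit hash-key ranges):

```python
# Sketch of Kinesis-style partitioning: the MD5 hash of a record's
# partition key selects one of N shards, so records with the same key
# always land on the same shard, preserving per-key ordering.
import hashlib

def shard_for(partition_key, num_shards):
    # Interpret the 128-bit MD5 digest as an integer...
    digest = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    # ...and map it into num_shards equal slices of the hash space.
    return digest * num_shards // 2**128

key = "device-42"
# The same key always maps to the same shard.
print(shard_for(key, 4) == shard_for(key, 4))  # True
```

This is also why the article's cost caveat matters: a hot partition key concentrates traffic on one shard, so key choice directly affects how many shards (and dollars) a stream needs.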

Pros

  • Supports live metrics and reporting.
  • Accommodates complex stream processing, including aggregating multiple streams. This allows more robust downstream processing.
  • Kinesis offers a flexible approach, including support for data sources pushing data directly into a stream.


Cons

  • Can present a formidable learning curve. Some users also report difficulty with documentation.
  • Can be costly and require significant input if an organization has a large number of data sources and requires a larger number of shards.
  • Some users report that extended fan-outs are difficult to manage.

Azure Event Hubs

Microsoft bills Azure Event Hubs as a “scalable event processing service that ingests and processes large volumes of events and data, with low latency and high reliability.” The big data streaming platform and event ingestion service processes millions of events per second, typically in an Azure cloud. It delivers low latency and strong integration with connected data sources.
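Because Event Hubs exposes a Kafka-compatible endpoint, an existing Kafka client can often be repointed at it through configuration alone. A sketch of the configuration shape for a kafka-python producer (the namespace and connection string are hypothetical placeholders; only the shape of the settings is shown):

```python
# Sketch of pointing an existing Kafka producer at Event Hubs via its
# Kafka-compatible endpoint (port 9093, SASL PLAIN where the username
# is the literal "$ConnectionString"). Namespace and secret are
# hypothetical placeholders.
event_hubs_kafka_config = {
    "bootstrap_servers": "mynamespace.servicebus.windows.net:9093",
    "security_protocol": "SASL_SSL",
    "sasl_mechanism": "PLAIN",
    "sasl_plain_username": "$ConnectionString",
    "sasl_plain_password": "Endpoint=sb://mynamespace.servicebus.windows.net/;...",
}

# With kafka-python, a producer would then be created as:
#   producer = KafkaProducer(**event_hubs_kafka_config)
print(event_hubs_kafka_config["security_protocol"])  # SASL_SSL
```

The application's produce/consume code stays unchanged; only the broker address and authentication settings differ from a self-hosted Kafka cluster.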

Pros

  • Uses the Kafka protocol to configure existing Apache Kafka applications to talk to Event Hubs. It also supports .NET, Java, Python and JavaScript.
  • Azure Event Hubs is a highly scalable framework that can extend to terabytes. An auto-inflate feature simplifies and streamlines scaling.
  • Strong support for telemetry sharing, user telemetry processing and strong transaction processing, with live dashboards.

Cons

  • The platform may require custom coding to support more advanced functionality.
  • Some users report difficulty with the interface and find the learning curve difficult.
  • Non-Azure cloud users may face increased difficulty using certain functions, including scheduling.

Azure Stream Analytics

The solution relies on a complex event processing engine to ingest high volumes of data from diverse sources in real time. It extracts data from devices, sensors, clickstreams, social media feeds, and enterprise applications. This makes it ideal for numerous scenarios, ranging from fleet management and predictive maintenance to point of sale and IoT.

Pros

  • The platform can run in the cloud or on the intelligent edge. It uses the same tools and query language for both.
  • Azure Stream Analytics delivers a high level of configurability and scalability.
  • It integrates seamlessly with various Azure services and adds them to menus automatically.

Cons

  • Doesn’t support auto-scaling. Users must configure streaming units manually.
  • Some users report crashes when the service encounters invalid and malformed data sets.
  • Lacks some of the advanced features required for more advanced IoT implementations.

Confluent

The vendor offers both fully managed and self-managed service options within an open-source framework. A SQL base allows users to build streaming analytics applications that monitor and manage data and events in real time. The platform ties into the Apache Kafka ecosystem to support highly complex tasks across numerous industries and business environments.

Pros

  • Provides powerful tools, features and capabilities to manage Kafka clusters.
  • Strong end-to-end visibility and manageability.
  • Powerful scaling functions due to numerous built-in connectors.

Cons

  • A complex platform that can present learning challenges.
  • Some users report difficulty testing within the platform.
  • Users say that role-based-access controls (RBAC) present some challenges.

Google Cloud Pub/Sub

The asynchronous messaging service is designed to decouple services that produce events from services that process events. It’s frequently used as messaging-oriented middleware or for event ingestion and delivery for streaming analytics pipelines.
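The decoupling Pub/Sub provides can be shown with a toy in-process model: publishers write to a topic without knowing who consumes it, and every subscription receives its own copy of each message. This illustrates the pattern only, not the google-cloud-pubsub API:

```python
# Toy in-process model of the publish/subscribe pattern: publishers
# are decoupled from consumers, and each subscription gets its own
# copy of every published message.
from queue import Queue

class Topic:
    def __init__(self):
        self._subscriptions = {}

    def subscribe(self, name):
        """Create a named subscription and return its message queue."""
        q = Queue()
        self._subscriptions[name] = q
        return q

    def publish(self, message):
        """Fan the message out to every subscription independently."""
        for q in self._subscriptions.values():
            q.put(message)

topic = Topic()
analytics = topic.subscribe("analytics")
audit = topic.subscribe("audit")
topic.publish({"event": "order.placed", "id": 7})
print(analytics.get()["id"], audit.get()["id"])  # 7 7
```

The real service adds durability, at-least-once delivery and acknowledgement deadlines on top of this basic fan-out, which is why it suits streaming analytics pipelines.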

Pros

  • High availability and consistent performance at scale.
  • Ease of configuration and a high level of flexibility.
  • Strong functionality along with tight integration with numerous other products and data services.

Cons

  • Some find the user interface confusing and have difficulty managing certain features.
  • Can be pricey for certain types of implementations and use cases. Some users find the pricing framework confusing.
  • Can be difficult to use without customizations.

IBM Streaming Analytics

IBM Streaming Analytics is equipped to analyze and correlate a broad range of streaming data, including unstructured text, video, audio, geospatial, and sensor data. It features real-time analysis of data in motion. It can analyze millions of events per second, enabling sub-millisecond response times. It’s available with IBM Cloud Pak for Data-as-a-Service.

Pros

  • Receives high user ratings for capacity, flexibility and scalability.
  • Easy to integrate with other IBM cloud services.
  • Offers a large set of optimized and tested toolkits.
  • Active developer community that contributes packages and solutions.

Cons

  • Numerous users report that documentation is sometimes lacking.
  • Can be expensive to operate in production environments.
  • Some users find it challenging to write complex business rules into the platform.

Kibana

The open-source data visualization dashboard is designed to handle Elasticsearch data and navigate the elastic stack. It reaches across documents and data sets to deliver numerous visualization formats, including histograms, line graphs, pie charts, and sunbursts. It also accommodates location analysis, time series models and machine learning.

Pros

  • Kibana is flexible and supports a high level of customization, including custom actions.
  • Delivers role-based and highly granular access controls.
  • Offers robust dashboards and built-in drill-down features that allow users to explore data in deeper ways.


Cons

  • The platform trails competitors for ease of setup and use.
  • Can consume a high level of computing resources in certain situations.
  • Search filters and notifications can be limited within certain scenarios.

Lenses

Lenses offers a developer workspace for building and operating real-time applications on Apache Kafka Connect and Kubernetes infrastructure. The platform is available on premises and in the cloud. It supports SQL-based real-time applications with centralized schema management.

Pros

  • Lenses receives high marks from users for ease of use and quality of support.
  • It offers a secure portal that allows users to configure, deploy and manage hundreds of Kafka Connect-compatible connectors, with integrated error handling.
  • Includes Google-like search and automatic discovery of data entities and metadata generated by real-time applications.

Cons

  • Trails other stream analytics vendors for ease of setup.
  • Some users would like to see a richer set of features and capabilities for supporting DataOps.

TIBCO Streaming

TIBCO Streaming delivers real-time enterprise-grade streaming analytics that reaches across the organization and out to the IoT. The cloud-ready solution supports the development of affordable real-time applications. It analyzes millions of events per second and provides ultra-fast continuous querying capabilities.

Pros

  • Handles highly complex data transformations.
  • Offers powerful user controls to manage ad-hoc queries, control and set business logic, define rules and models, configure charts, change the panel layout, create and manage alerts, and more.
  • Offers more than 150 pre-built adapters and visualization options for Kafka and numerous other formats.
  • Delivers full cloud-enablement with support for Docker.

Cons

  • Some users report a lack of support for managing third party library dependencies.
  • Users report that security controls could be more robust.
  • May lack flexibility for certain configurations, such as reusing modular components.

Stream Analytics Software Comparison Table

Amazon Elasticsearch Service
Pros: Fast setup; highly scalable; large storage capacity; strong security controls.
Cons: Learning curve; queries and indexing can be difficult; interface can be challenging.

Amazon Kinesis
Pros: Supports live metrics and reporting; handles highly complex stream processing; flexible.
Cons: Learning curve; can be pricey; large implementations and fan-outs can be difficult to manage.

Azure Event Hubs
Pros: Strong support for Kafka and development languages; highly scalable with large capacity; strong telemetry support.
Cons: Advanced functionality may require custom coding; interface can be daunting; limited functionality for non-Azure users.

Azure Stream Analytics
Pros: Runs in the cloud or on the edge; high level of configurability and scalability; integrates seamlessly with other Azure modules and services.
Cons: Lacks auto-scaling functionality; crash-prone when it encounters invalid and malformed data; lacks advanced features required for IoT projects.

Confluent
Pros: Powerful features and tools; strong end-to-end visibility and manageability; powerful scaling functions.
Cons: Complexity of platform; testing within the platform may be challenging; RBAC can prove difficult.

Google Cloud Pub/Sub
Pros: High availability and consistent performance at scale; ease of configuration; flexible; robust functionality.
Cons: Expensive for certain uses and configurations; may require additional customization; some find the user interface difficult.

IBM Streaming Analytics
Pros: Excels in capacity, flexibility and scalability; tight integration with other IBM cloud services; robust toolkit; active developer community.
Cons: Users report documentation is subpar; can be challenging to operate in production environments; doesn't always support complex business rules.

Kibana
Pros: Highly flexible and customizable; strong role-based access controls; excellent dashboards and drill-down capabilities.
Cons: Can be difficult to set up and use; may heavily consume computing resources; limited search filters and capabilities within certain scenarios.

Lenses
Pros: Ease of use; strong support; robust management portal; strong search capabilities.
Cons: Setup can be challenging; users say they would like to see richer features for DataOps.

TIBCO Streaming
Pros: Handles highly complex data transformations; strong support for business logic and user controls; full cloud enablement with Docker support.
Cons: Lacks support for third-party library dependencies; users report some security features are lacking; lacks flexibility for reusing modular components.

Top AIOps Companies https://www.datamation.com/artificial-intelligence/top-aiops-companies/ Thu, 05 Nov 2020 17:10:17 +0000 https://datamation.com/?p=20421

Artificial intelligence for IT operations (AIOps) taps artificial intelligence (AI) to streamline and simplify information technology (IT) management. The technology collects data across increasingly complex IT infrastructure, identifying key patterns and events, and automating problem resolution. AIOps platforms typically rely on advanced analytics and machine learning tools to identify the root cause of issues and problems—and address them without human involvement.
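The statistical analysis at the heart of this can be shown with a toy baseline model: flag any metric sample that deviates sharply from the series mean. A deliberately minimal sketch (real AIOps platforms use far more sophisticated models; the 2-sigma threshold and latency numbers here are illustrative):

```python
# Toy anomaly detector of the kind AIOps platforms automate: flag
# metric samples more than 2 standard deviations from the mean.
# Threshold and sample data are illustrative only.
import statistics

def anomalies(samples, threshold=2.0):
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    # Guard against a zero stdev (perfectly flat series has no outliers).
    return [x for x in samples if stdev and abs(x - mean) > threshold * stdev]

latency_ms = [21, 22, 20, 23, 21, 22, 20, 180]  # one spike
print(anomalies(latency_ms))  # [180]
```

An AIOps platform would go further: correlate the spike with concurrent events from other data sources, infer a probable root cause, and trigger an automated remediation, which is the part that removes humans from the loop.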

In recent years, as data analytics has exploded and cloud computing has become commonplace, AIOps has gone mainstream. By 2023, 40% of DevOps teams will augment application and infrastructure monitoring tools with artificial intelligence for IT operations (AIOps) platform capabilities, Gartner’s research noted. Currently, Gartner estimates the size of the AIOps platform market at between $300 million and $500 million per year.

The appeal of AIOps is straightforward. Gartner points out that these platforms enhance “decision making by contextualizing large volumes of varied and volatile data.” However, it also noted that while the space is advancing rapidly and adoption remains on the upswing, “AIOps platform maturity, IT skills and operations maturity are the chief inhibitors to rapid time to value.”

The upshot? It’s critical to understand what your business needs are and what value proposition vendors offer before committing to an AIOps platform.


How to Choose an AIOps Company

If you’re in the market for an AIOps solution, here are some things to consider:

  • A starting point for choosing a vendor and a specific solution is understanding how your current IT infrastructure can benefit from AIOps and what use case serves as a good starting point for replacing rules-based analytics with an automated framework of network diagnostics.
  • Two general categories of AIOps exist: domain-centric platforms with built-in monitoring tools and domain-agnostic stand-alone solutions. Each has tools for ingesting events, metrics and traces. Understanding which delivers bigger benefits can clarify the vendor-selection process.
  • It’s important to select a solution that has business-specific IT service management (ITSM) use cases revolving around task automation, knowledge management and change analysis.
  • Successful implementations enable insights across IT operations management (ITOM) through three crucial aspects of AIOps, Gartner reports. These include observe, engage and act. Ensure that your organization understands how a solution fits—and connects to other tools—before finalizing vendor selection.

Top AIOps Companies

Here are 10 of the top vendors in the AIOps arena, along with some of their top features and selling points.

AppDynamics

Value Proposition: AppDynamics Central Nervous System ranks high among AIOps vendors with its broad and deep views into networks. Its parent company is Cisco Systems, though the solution works across numerous systems and frameworks. Top customers include Alaska Airlines, Paychex and Nasdaq. Gartner ranked AppDynamics among the “Leaders” in its 2020 Magic Quadrant for Application Performance Monitoring. It also ranked as a “Leader” on the G2 Grid for AIOps Platforms and earned 4.2 out of 5 stars in G2 user ratings.

Summary: Central Nervous System focuses on three primary tasks: visibility, insights and action. It incorporates a cognition engine that delivers cross-domain visibility, insights and automation—along with automated anomaly detection and root cause analysis. This aids in reducing mean time to resolution (MTTR). A serverless APM shows relationships among applications and promotes deep integrations across numerous partners. This allows users to gain an expansive view of application code and the underlying network. Cisco ACI and AppDynamics integration delivers insights into cloud infrastructure, including network-configured policies and automated security enforcement.

BigPanda

Value Proposition: The vendor has emerged as a popular choice in the AIOps space, with customers such as InterContinental Hotels Group, Foot Locker, United Airlines and Staples. It recently introduced what it describes as the “first Event Correlation and Automation platform powered by AIOps.” It focuses on gleaning insights and resolving IT issues across the entire IT stack and generating unified analytics. BigPanda received 4.1 out of 5 stars from users at G2.

Summary: BigPanda approaches AIOps through a “monitoring, change, and topology” framework that is part of an overall ITSM framework. It uses proprietary “open box” machine learning technology to spot, correlate and resolve problems. Key AIOps capabilities and features include: an Open Integration Hub that collects, normalizes and enriches monitoring data; Open Box Machine Learning; an operations console that handles bi-directional integrations; and unified analytics. The company claims that its machine learning component reduces noise by 95% or more while nearly eliminating false positives.
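To make the noise-reduction concept concrete, here is a minimal, hypothetical sketch of the general idea behind event correlation: collapsing raw alerts that share a fingerprint (here, host plus check name) within a short time window into a single incident. The window length, the alert fields and the grouping key are all illustrative assumptions—this is not BigPanda’s actual algorithm, which relies on ML-driven pattern discovery rather than fixed rules.

```python
from collections import defaultdict

WINDOW_SECONDS = 300  # assumed correlation window, for illustration only

def correlate(alerts):
    """Collapse raw alerts into incidents keyed by (host, check).

    Alerts with the same key arriving within WINDOW_SECONDS of the
    previous alert are merged into one incident; a longer gap starts
    a new incident.
    """
    incidents = defaultdict(list)  # (host, check) -> list of incident groups
    for alert in sorted(alerts, key=lambda a: a["ts"]):
        groups = incidents[(alert["host"], alert["check"])]
        if groups and alert["ts"] - groups[-1][-1]["ts"] <= WINDOW_SECONDS:
            groups[-1].append(alert)   # within the window: merge
        else:
            groups.append([alert])     # outside the window: new incident
    return [g for groups in incidents.values() for g in groups]

alerts = [
    {"host": "db1", "check": "cpu", "ts": 0},
    {"host": "db1", "check": "cpu", "ts": 60},    # merged with the first
    {"host": "db1", "check": "cpu", "ts": 1000},  # new incident (gap > window)
    {"host": "web1", "check": "disk", "ts": 30},
]
print(len(correlate(alerts)))  # → 3 incidents from 4 raw alerts
```

Even this toy rule cuts the alert count; production platforms generalize the idea by learning correlation patterns across topology and change data instead of hand-coding keys and windows.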

BMC

Value Proposition: BMC is a leading player in the AIOps space. It offers several products that map, log and manage IT infrastructure—and it has established partnerships with most major players in networking and clouds. The company’s open data access approach taps multiple data sources for historical and streaming data. Customers include Ingram, Boston Scientific, Carfax, Lockheed Martin and Vodafone. Its products receive good to excellent ratings from users at G2 and other review sites.

Summary: The vendor’s Helix Monitor is an end-to-end service and operations platform that operates under a SaaS model and uses a data-agnostic approach. The solution relies on a containerized, microservices architecture with open APIs and customizable dashboards. It is designed to provide broad monitoring and event management with integrated ITSM and ITOM. The vendor claims that its AIOps solutions reduce noise by about 90%, trim time to identify root cause by 60%, and slash event remediation MTTR by 75%. The company offers other tools, including TrueSight Operations Management, which taps machine learning (ML) and advanced analytics for more holistic monitoring and event management.

DataDog

Value Proposition: DataDog is a SaaS platform that delivers real-time application and IT monitoring along with log management and automation. It boasts major customers such as Peloton, 21st Century Fox, Samsung and Whole Foods Market. The company was ranked a “Leader” in the 2019 Forrester Wave for Intelligent Application and Service Monitoring. It’s also ranked as a “Leader” on the G2 Grid, and receives 4.2 out of 5 stars from users at G2.

Summary: The vendor supports visibility into all modern platforms and applications. It includes robust tools for monitoring, troubleshooting and optimizing performance. This includes log analysis that analyzes and explores data in context. The result is end-to-end proactive monitoring that detects and fixes performance issues through AI-powered self-maintenance tests. The platform also offers an assortment of tools to correlate frontend performance with business impact.

DynaTrace

Value Proposition: The vendor offers a full-stack and highly automated AIOps solution that includes Davis, an assistant that continually processes billions of events and dependencies in milliseconds using AI and open APIs. This allows it to identify IT problems and deliver more precise root-cause analysis. Dynatrace AIOps customers include industry giants such as Kroger, Citrix and Experian. Gartner ranked the vendor a “Leader” in its 2020 Magic Quadrant for APM. The firm also ranked as a “Leader” on the G2 Grid, and receives 4.5 out of 5 stars from users at G2.

Summary: Dynatrace offers several products designed to improve IT monitoring and performance. The AIOps platform, using Davis, takes an all-in-one approach that identifies precise root cause, tackles open ingestion, handles orchestration and addresses topology and dependencies across systems, including clouds and mainframes. The AIOps solution features auto discovery, advanced event analytics, anomaly detection and predictive capabilities. The AI assistant generates topology visualizations and business impact analysis data.

Moogsoft

Value Proposition: The company’s approach is built on an “advanced self-servicing AI-driven observability platform” that’s designed to deliver deep and real-time visibility into IT issues. The Moogsoft solution is designed for software engineers, developers and operations staff. Major customers include Qualcomm, Verizon Media, Fannie Mae and KeyBank. The solution is highly rated among users at G2.

Summary: Moogsoft provides a high level of automation for end-to-end events through its cloud-native AI and ML “Observability” platform. It collects data from numerous sources and correlates events through pattern discovery to deliver real-time insights. The solution is designed to identify root causes, use collaboration methods to ensure the right people receive notifications, and filter out noise and reduce alerts so that teams can tackle the most urgent matters. It delivers highly automated remediation for proactive incident resolution.

New Relic

Value Proposition: New Relic focuses on applied intelligence, which aims to detect, understand, and resolve incidents faster through noise reduction and deeper insights. Major customers include American Eagle Outfitters, Hearst and H&R Block. New Relic APM receives a 4.3 out of 5-star rating from users at review site G2.

Summary: New Relic offers a comprehensive list of features for its AIOps platform. This includes availability testing, event logs, event-based notifications, performance metrics, real-time monitoring, transaction monitoring, and uptime reporting. The platform offers automated anomaly detection, including highly flexible proactive detection through real-time failure warnings and deep incident intelligence. Applied intelligence offers guidance and analysis designed to speed incident resolution.

PagerDuty

Value Proposition: The company’s focus is on a single platform designed to keep digital systems running all the time and in perfect order. Cloud-native PagerDuty is built to work straight out of the box. It offers more than 370 integrations, including ServiceNow, Slack, Zendesk, AWS, Zoom and many others. Customers include American Express, BBC, Doordash and Netflix. PagerDuty is ranked as a “Leader” on the G2 Grid. It receives a 4.5 out of 5 stars at the G2 user rating site.

Summary: The platform offers powerful features, including on-call management, incident response, event intelligence and analytics. The Event Intelligence module reduces noise and directs insights to the right team for faster and better event resolution. The analytics feature uses pre-built metrics and prescriptive dashboards to deliver broader and deeper insights. The vendor boasts that data science knowledge isn’t needed.

ScienceLogic

Value Proposition: ScienceLogic Platform offers a rich array of IT infrastructure monitoring and remediation tools, including bandwidth monitoring, diagnostics, IP monitoring, real-time analytics, resource management, server monitoring, SLA monitoring, uptime monitoring, and web traffic reporting. Major customers include AAA, Cisco, Kellogg’s, Telstra and the EPA. The company was ranked a “Leader” in the Forrester Wave IASM Q2 2019. It receives 4.3 out of 5 stars from users at G2.

Summary: The vendor focuses on a three-prong approach: see, contextualize and act. This includes powerful real-time discovery and contextualization capabilities. According to Forrester, ScienceLogic was the top-rated vendor in the intelligent application and service monitoring space for 2019. It noted that ScienceLogic is adept at “handling massive data aggregation and disparate architectures.” The vendor uses an algorithmic approach to build and search through a real-time data lake. This allows the platform to incorporate advanced automation, including run-book automation, predictive capacity allocation, and CMDB rationalization.

Splunk

Value Proposition: Splunk Enterprise collects, analyzes and acts on complex and disparate data generated by IT systems. Customers include Airbus, Dominos, Porsche and Cox Automotive. The vendor was ranked number one by Gartner in Market Share Analysis: ITOM, Performance Analysis Software, 2019 and earned 4.2 out of 5 stars from users at G2.

Summary: Splunk uses machine learning, multi-site clustering and an open development platform to drive operational improvements within an organization. It boasts a data-to-everything platform designed to investigate, monitor, analyze and act. The framework ingests data of any structure, from any source and timescale, applying AI and machine learning. It supports a broad range of users across the business as well as automated actions based on customized rules or AI-driven decision making. This promotes a framework with reduced IT complexity, 360-degree service visibility and preventative alerts with auto-remediation.
