Maximize the Value of Data-in-Motion
With Big Data from the Internet of Things

Hortonworks DataFlow (HDF™)

Hortonworks DataFlow (HDF) provides the only end-to-end platform that collects, curates, analyzes, and acts on data in real time, on-premises or in the cloud, through a drag-and-drop visual interface. HDF is an integrated solution built on Apache NiFi/MiNiFi, Apache Kafka, Apache Storm, and Druid.

The HDF streaming data analytics platform comprises Flow Management, Stream Processing, and Enterprise Services.

Powering the Future of Data

HDF Data-In-Motion Platform

Three Major Components of Hortonworks DataFlow


Easy, Secure, and Reliable Way to Manage Data Flow 

Collect and manipulate data flows securely and efficiently while providing real-time operational visibility, control, and management.


Immediate and Continuous Insights  

Build streaming analytics applications in minutes to capture perishable insights in real-time without writing a single line of code.

Learn More

Corporate Governance, Security and Operations 

Manage the HDF and HDP ecosystem with a comprehensive management console for provisioning, monitoring, and governance.

Learn More

Integrated, Data-Source-Agnostic Collection Platform

HDF has full-featured data collection capabilities that are streaming-data agnostic, with more than 220 integrated processors. Data can be collected from dynamic and distributed sources of differing formats, schemas, protocols, speeds, and sizes, including machines, geolocation devices, clickstreams, files, social feeds, log files, and videos.
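The pipeline idea behind this source-agnostic collection can be sketched in a few lines of Python. This is a toy illustration, not HDF or NiFi code; the parser names, record shapes, and sample payloads are invented for the example:

```python
import csv
import io
import json

# Hypothetical sketch of NiFi-style "processors": each source format gets
# a parser that emits uniform records, so differing formats can feed one
# downstream flow. parse_json, parse_csv, and collect are illustrative
# names, not HDF APIs.

def parse_json(payload: str) -> list[dict]:
    """Parse a JSON payload into a list of records."""
    data = json.loads(payload)
    return data if isinstance(data, list) else [data]

def parse_csv(payload: str) -> list[dict]:
    """Parse a CSV payload into a list of records."""
    return list(csv.DictReader(io.StringIO(payload)))

PARSERS = {"json": parse_json, "csv": parse_csv}

def collect(sources: list[tuple[str, str]]) -> list[dict]:
    """Format-agnostic ingest: dispatch each source to the right parser."""
    records = []
    for fmt, payload in sources:
        records.extend(PARSERS[fmt](payload))
    return records

records = collect([
    ("json", '[{"device": "sensor-1", "temp": 21}]'),
    ("csv", "device,temp\nsensor-2,19\n"),
])
```

In a real deployment the per-format parsing, routing, and prioritization is configured visually in NiFi rather than written by hand.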

More information:

  • How real-time, data-source-agnostic dataflow management makes data movement easy
    Watch Video
    Learn More
  • Learn what HDF can do to optimize log analytics from the edge
    Read More
Powerful Data Collection


With HDF, data collection is no longer a tedious process. You can manage data in full flight through a visual control panel: adjust sources, join and split streams, and prioritize data flows. HDF can also add contextual data to your streams for more complete analysis and insight. Always-on data provenance and audit trails support security and governance compliance and enable real-time troubleshooting when necessary. Integrated with Apache NiFi, MiNiFi, Kafka, and Storm, HDF is ready for high-volume event processing for immediate analysis and action. Kafka accommodates differing rates of data creation and delivery, while Storm provides real-time streaming analytics and immediate insights at massive scale.
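The role Kafka plays in accommodating differing production and consumption rates can be illustrated with a toy bounded log. This is not HDF or Kafka code; the class, capacity, and event counts are invented to show how a buffer decouples a bursty producer from a slower consumer:

```python
from collections import deque

# Toy sketch of rate decoupling: a bounded log absorbs bursts so the
# producer and consumer can run at different speeds. When the buffer is
# full, events are counted as dropped; a real system would apply
# backpressure or spill to disk instead.

class BoundedLog:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.buffer = deque()
        self.dropped = 0

    def produce(self, event):
        """Append an event, or record the overflow when the log is full."""
        if len(self.buffer) >= self.capacity:
            self.dropped += 1
        else:
            self.buffer.append(event)

    def consume(self, max_events: int) -> list:
        """Drain up to max_events in arrival order."""
        n = min(max_events, len(self.buffer))
        return [self.buffer.popleft() for _ in range(n)]

log = BoundedLog(capacity=5)
for i in range(8):              # bursty producer: 8 events in one burst
    log.produce(i)
first_batch = log.consume(3)    # slower consumer drains in small batches
```

Kafka's actual design (partitioned, replicated, persistent logs with consumer offsets) is far richer, but the buffering principle is the same.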

More information:

  • How streaming data managed through Apache NiFi's real-time visual user interface increases operational effectiveness
    Watch Video
Real-Time Data Flow Management


HDF secures end-to-end data flow and routing from source to destination with discrete user authorization and a detailed, real-time visual chain of custody. Use HDF's visual user interface to encrypt streaming data, route it to Kafka, configure buffers, and manage congestion so that data can be dynamically prioritized and securely sent. HDF enables role-based data access, allowing enterprises to dynamically and securely share select pieces of pertinent data. Flow management and streaming applications can be deployed in a Kerberized environment with little operational overhead.
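The idea of sharing only select pieces of pertinent data per role can be sketched as a simple field-level filter. This is purely illustrative; the roles, field names, and policy table are invented, and HDF enforces such policies through its security layer (Ranger-style policies), not application code like this:

```python
# Hypothetical sketch of role-based data access: each role sees only the
# fields its policy allows. POLICIES and the record fields are invented
# for the example.

POLICIES = {
    "analyst": {"device", "temp"},                  # no access to location
    "operator": {"device", "temp", "location"},     # full access
}

def redact(record: dict, role: str) -> dict:
    """Return only the fields the given role is authorized to see."""
    allowed = POLICIES.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

event = {"device": "sensor-1", "temp": 21, "location": "plant-7"}
analyst_view = redact(event, "analyst")
operator_view = redact(event, "operator")
```

An unknown role falls through to an empty policy and sees nothing, which is the safe default for this kind of filter.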

More information:

  • See how granular data access improves on role-based access
    Watch Video
Enterprise-Grade Security


HDF includes a complete streaming analytics module, Streaming Analytics Manager (SAM), for building streaming analytics applications that perform event correlation, context enrichment, complex pattern matching, and analytical aggregations, and that raise alerts and notifications when insights are discovered. SAM makes it easy for application developers, DevOps teams, and business analysts to build, deploy, and manage streaming applications in minutes without writing a single line of code. Analysts use pre-built charts to quickly build analyses and create dashboards, while DevOps teams can manage and monitor application performance right out of the box.
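The kind of aggregation-plus-alerting pipeline SAM lets you assemble visually can be sketched in plain Python. The window size, field name, threshold, and sample readings below are invented for the illustration; SAM users configure the equivalent with drag-and-drop components rather than code:

```python
# Toy sketch of a streaming-analytics step: a tumbling-window average
# over a sensor field, followed by a threshold alert on each window.

def tumbling_avg(events: list[dict], window: int, field: str) -> list[float]:
    """Average `field` over consecutive fixed-size (tumbling) windows."""
    return [
        sum(e[field] for e in events[i:i + window]) / window
        for i in range(0, len(events) - window + 1, window)
    ]

def alerts(averages: list[float], threshold: float) -> list[int]:
    """Indices of windows whose average exceeds the threshold."""
    return [i for i, avg in enumerate(averages) if avg > threshold]

readings = [{"temp": t} for t in (20, 22, 30, 34, 21, 23)]
window_avgs = tumbling_avg(readings, window=2, field="temp")
hot_windows = alerts(window_avgs, threshold=25.0)
```

Here the second window (readings 30 and 34) averages above the threshold and would trigger a notification in a real application.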


HDF includes Schema Registry, a central schema repository that allows analytics applications to interact with each other flexibly. Users can save, edit, and retrieve schemas for the data they need, and attach schemas to data without incurring additional overhead, for greater operational efficiency. With schema version management, data consumers and data producers can evolve at different rates, and schema validation greatly improves data quality. A central schema registry also provides greater governance over how data is used. Schema Registry is integrated with Apache NiFi and the HDF Streaming Analytics Manager.
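What a schema registry provides (versioned schemas per subject, plus a compatibility gate before a new version is accepted) can be sketched with a toy in-memory class. This is not the HDF Schema Registry API; the subject name, field layout, and the simplified compatibility rule (a new version must keep every existing field) are invented stand-ins for real Avro compatibility policies:

```python
# Toy in-memory schema registry: versioned schemas per subject with a
# compatibility check so consumers and producers can evolve safely at
# different rates.

class SchemaRegistry:
    def __init__(self):
        self.subjects: dict[str, list[dict]] = {}

    def register(self, subject: str, schema: dict) -> int:
        """Register a new schema version; reject incompatible changes."""
        versions = self.subjects.setdefault(subject, [])
        if versions and not self.compatible(versions[-1], schema):
            raise ValueError("incompatible schema change")
        versions.append(schema)
        return len(versions)              # 1-based version number

    @staticmethod
    def compatible(old: dict, new: dict) -> bool:
        # Simplified rule: the new version must keep all existing fields.
        return set(old) <= set(new)

    def latest(self, subject: str) -> dict:
        return self.subjects[subject][-1]

registry = SchemaRegistry()
v1 = registry.register("truck-events", {"driver_id": "int", "speed": "int"})
v2 = registry.register("truck-events", {"driver_id": "int", "speed": "int", "lat": "float"})
```

Adding a field is accepted as version 2, while an attempt to drop `speed` would be rejected, which is the guarantee that lets old consumers keep reading new data.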


Build Analytics Faster with Streaming Analytics Manager


Build analytics applications easily with a drag-and-drop visual paradigm and drop-down analytics functions


Analyze quickly with rich visual dashboard and an analytics engine powered by Druid


Operate efficiently with prebuilt monitoring dashboards of system metrics

Manage Data Flows More Easily with Schema Registry


Eliminate the need to code and attach a schema to every piece of data, reducing operational overhead


Allow data consumers and producers to evolve at different rates with schema version management


Store schema for any type of entity or data store, not just Kafka

HDF User Guides

Get HDF release notes and guides for users, developers, and the basics.


The industry's best support for Apache NiFi, Kafka, and Storm in the enterprise. Connect with our team of experts and get help throughout your journey.


Real-world training delivered by Big Data experts. Available in person or on-demand, whenever you need us.