
FastData Cloud Professional

By: 西云数据 (NWCD) Latest Version: v1_0-fixed
Linux/Unix

This version has been removed and is no longer available to new customers.

Product Overview

Real-time lakehouse: built on a next-generation storage-compute separation architecture that provides elastic scaling and high-concurrency processing. It is an enterprise-grade distributed database that unifies streaming and batch workloads, enabling seamless data flow between data warehouses and data lakes, a unified metadata system, and smooth integration with the big data ecosystem. It retains the flexibility of data lakes while preserving the growth and scale capabilities of data warehouses.
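The unified-metadata idea above can be illustrated with a minimal sketch: one catalog maps logical table names to their physical tier, so queries can address lake and warehouse data uniformly. The catalog entries, tier names, and locations here are invented for illustration and are not FastData's actual metadata model.

```python
# Minimal sketch of a unified metadata catalog spanning lake and
# warehouse storage. All names and locations are hypothetical.
CATALOG = {
    "events_raw": {"tier": "lake", "location": "s3://example-bucket/events/"},
    "sales_agg": {"tier": "warehouse", "location": "wh://db/sales_agg"},
}

def resolve(table: str):
    """Look up which tier a logical table lives on and where."""
    entry = CATALOG.get(table)
    if entry is None:
        raise KeyError(f"unknown table: {table}")
    return entry["tier"], entry["location"]

print(resolve("events_raw"))  # routed to the lake tier
print(resolve("sales_agg"))   # routed to the warehouse tier
```

A real system would back this with a metastore service rather than a dictionary, but the routing principle is the same: callers use one logical name regardless of where the data physically resides.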

Data modeling: supports logical and physical model design so the modeling process is standardized and controllable; supports reverse engineering of models to quickly bring existing enterprise data models under management; incorporates best-practice methodologies from multiple industries to improve modeling efficiency; and supports associated data standards.
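The logical-to-physical step can be sketched as a type-mapping exercise: a logical entity with abstract attribute types is rendered as DDL for a concrete engine. The type table and function below are an assumption for illustration, not the product's actual mapping rules.

```python
# Illustrative logical-to-physical model mapping; the type table is a
# hypothetical example, not FastData's actual rules.
LOGICAL_TO_PHYSICAL = {
    "string": "VARCHAR(255)",
    "integer": "BIGINT",
    "timestamp": "TIMESTAMP",
}

def to_ddl(entity: str, attributes: dict) -> str:
    """Render a logical entity as a CREATE TABLE statement."""
    cols = ",\n  ".join(
        f"{name} {LOGICAL_TO_PHYSICAL[t]}" for name, t in attributes.items()
    )
    return f"CREATE TABLE {entity} (\n  {cols}\n);"

print(to_ddl("customer", {"id": "integer", "name": "string", "created_at": "timestamp"}))
```

Reverse engineering runs the same mapping in the opposite direction: parse existing DDL back into logical entities so inventory models come under management.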

Data integration: provides an enterprise-grade real-time data synchronization service based on CDC technology that is non-intrusive to business systems, ensuring data timeliness and availability; checkpoint (CKP)-based automatic failure recovery under a WAL architecture enables resumable transfer, keeping data transmission stable under complex network conditions; a plugin-based extension mechanism allows rapid iteration of integration capabilities and broadening of the range of supported data sources.
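The checkpoint-resume behavior described above can be sketched as follows: a replicator records the last durably applied log position, so after an interruption it skips already-applied entries instead of restarting from scratch. This is a generic illustration of checkpointed replication under stated assumptions, not FastData's actual CKP implementation.

```python
# Sketch of checkpoint-based resumable replication (the general idea
# behind WAL + checkpoint recovery); names and storage are simulated.
class CheckpointedReplicator:
    def __init__(self):
        self.checkpoint = 0  # last durably applied log position
        self.applied = []    # simulated target store

    def replicate(self, change_log, fail_at=None):
        """Apply entries after the checkpoint, persisting position per entry."""
        for pos, entry in enumerate(change_log, start=1):
            if pos <= self.checkpoint:
                continue  # already applied before the interruption
            if fail_at is not None and pos == fail_at:
                raise ConnectionError("network drop")  # simulated outage
            self.applied.append(entry)
            self.checkpoint = pos  # would be a durable write in a real system

r = CheckpointedReplicator()
log = ["INSERT 1", "UPDATE 1", "INSERT 2", "DELETE 1"]
try:
    r.replicate(log, fail_at=3)  # interrupted mid-stream
except ConnectionError:
    pass
r.replicate(log)  # resume: entries 1-2 are not re-applied
print(r.applied)
```

Because entries up to the checkpoint are skipped on resume, each change is applied exactly once even across failures, which is what keeps transmission stable under unreliable networks.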

Data development: supports visual offline/real-time task development through web-based SQL, lowering the learning curve; multiple DAG organization forms allow task dependencies across workflows and projects, covering a wide range of business scenarios; a rich set of big data components lets tasks be implemented flexibly according to available resources, improving resource utilization; development/production environment isolation and multi-user collaborative development make the process safer and more efficient.
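DAG-organized task dependency boils down to topological ordering: a task runs only after all of its upstream tasks finish. The sketch below uses Kahn's algorithm with invented task names; it illustrates the scheduling principle, not FastData's scheduler.

```python
# Sketch of DAG-ordered task scheduling via Kahn's algorithm.
# Task names are hypothetical examples.
from collections import deque

def topological_order(deps):
    """deps maps task -> set of upstream tasks it waits on."""
    indegree = {t: len(up) for t, up in deps.items()}
    downstream = {t: [] for t in deps}
    for task, ups in deps.items():
        for up in ups:
            downstream[up].append(task)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for down in downstream[task]:
            indegree[down] -= 1
            if indegree[down] == 0:
                ready.append(down)
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a valid DAG")
    return order

deps = {
    "extract": set(),
    "clean": {"extract"},
    "load": {"clean"},
    "report": {"load", "clean"},
}
print(topological_order(deps))
```

Cross-project dependency is the same mechanism with tasks drawn from more than one workflow's namespace; the cycle check is what guarantees the dependency graph stays a DAG.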

Data operations and maintenance: data processing tasks are organized and monitored as DAGs, making operations such as task repair, restart, suspension, and termination cleaner; a complete alerting system supports custom alert rules and rich log information to improve operational efficiency; switching between computing engines is also supported.
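Custom alert rules typically reduce to predicates evaluated against each finished task run. The rule fields and conditions below are hypothetical, shown only to make the idea concrete.

```python
# Sketch of custom alert rules over task runs; the rule schema
# (name/severity/condition) is an invented example.
def evaluate_alerts(run, rules):
    """Return a message for every rule the finished task run triggers."""
    fired = []
    for rule in rules:
        if rule["condition"](run):
            fired.append(f"[{rule['severity']}] {rule['name']}: task {run['task']}")
    return fired

rules = [
    {"name": "task_failed", "severity": "CRITICAL",
     "condition": lambda r: r["state"] == "FAILED"},
    {"name": "slow_task", "severity": "WARN",
     "condition": lambda r: r["duration_s"] > 3600},
]
run = {"task": "daily_load", "state": "FAILED", "duration_s": 4200}
print(evaluate_alerts(run, rules))  # both rules fire for this run
```

In practice the fired messages would be routed to channels (email, SMS, webhooks) by severity; the point is that users define the predicates, not the platform.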

Data services: drag-and-drop workflow orchestration implements complex API scenarios; unified enterprise data sharing services strictly control data usage rights; service usage is monitored and analyzed from multiple perspectives to efficiently evaluate the value of data assets.
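Strict control of usage rights usually means an authorization check on every service call. The grant table and role names below are invented for illustration; they are not the product's permission model.

```python
# Sketch of permission-gated data service access; the (role, service)
# grant table is a hypothetical example.
GRANTS = {
    ("analyst", "sales_api"): {"read"},
    ("admin", "sales_api"): {"read", "manage"},
}

def authorize(role: str, service: str, action: str) -> bool:
    """Allow the action only if the role holds an explicit grant."""
    return action in GRANTS.get((role, service), set())

print(authorize("analyst", "sales_api", "read"))    # permitted
print(authorize("analyst", "sales_api", "manage"))  # denied: no grant
```

Default-deny (an empty set when no grant exists) is the conservative choice for shared data services: access must be granted explicitly, never inferred.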

Data quality: quality supervision and inspection cover the entire data capitalization process, ensuring data integrity, validity, timeliness, consistency, accuracy, and uniqueness; built-in quality check rule templates, plus support for custom rules, cover a wider range of inspection scenarios; quality checks can be executed in conjunction with ETL tasks to detect problem data promptly and reduce data pollution.
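Two of the dimensions named above, completeness and uniqueness, can be sketched as simple column-level checks. These are generic rule-template examples under assumed semantics, not the product's built-in templates.

```python
# Sketch of column-level quality checks (completeness, uniqueness);
# the rule definitions are illustrative examples.
def check_completeness(rows, column):
    """Fraction of rows with a non-null value in `column`."""
    if not rows:
        return 1.0
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

def check_uniqueness(rows, column):
    """True when no value of `column` repeats."""
    values = [r[column] for r in rows if column in r]
    return len(values) == len(set(values))

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},
]
print(check_completeness(rows, "email"))  # 2 of 3 rows have an email
print(check_uniqueness(rows, "id"))       # False: id 2 appears twice
```

Hooking such checks to an ETL task means running them on each batch as it lands, so a failing threshold can block downstream loads before polluted data spreads.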

Data security: throughout the entire data capitalization process, security measures such as encryption, masking, and access management for private data ensure that data is handled safely at every stage.
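Masking, one of the measures listed, commonly replaces the middle of a sensitive value while keeping enough of the ends for matching and debugging. The keep-head/keep-tail policy below is an assumption for illustration, not the product's masking rule.

```python
# Sketch of static data masking for private fields; the keep-3/keep-4
# policy is a hypothetical example (common for phone numbers).
def mask(value: str, keep_head: int = 3, keep_tail: int = 4) -> str:
    """Replace the middle of a sensitive string with asterisks."""
    if len(value) <= keep_head + keep_tail:
        return "*" * len(value)  # too short to expose either end safely
    middle = "*" * (len(value) - keep_head - keep_tail)
    return value[:keep_head] + middle + value[-keep_tail:]

print(mask("13812345678"))  # -> 138****5678
```

Unlike encryption, masking is irreversible by design: the masked form is safe to show in query results and logs because the original cannot be recovered from it.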

Version

v1_0-fixed

Operating System

Linux/Unix, Amazon Linux 2 (kernel 5.10.109-104.500.amzn2.x86_64 GNU/Linux)

Delivery Methods

  • Amazon Machine Image
