Data Management

Alibabacloud.com offers a wide variety of articles about data management; you can easily find the data management information you need here online.

A survey of data-centric smart city research

A survey of data-centric smart city research, by Wang Jingyuan and others. Amid the tide of urban informatization and the rise of data science, the smart city has become a new concept and practice of urban development around the world. Data management, application, and analysis technologies such as big data, data activation, and data mining play a key role in smart city construction. From the perspective of information science, this paper focuses on data-centric topics and summarizes the latest developments in smart city research. It surveys the urban data types widely used in smart city research and their ...

Research on library mass data storage based on cloud computing

Research on library mass data storage based on cloud computing. Data storage technology based on cloud computing has changed the management model of the modern library and updated traditional modes of information service. Modern digital library systems will be built along the development direction of the networked information service industry, which has a great impact on earlier data storage models. This paper studies massive data storage technology in the cloud computing environment, analyzes the main features of the modern digital library and the application of cloud computing technology in library data management, and puts forward corresponding ...

Cloud Computing Overview

Cloud Computing Overview. This paper reviews the origin, concept, and characteristics of cloud computing and how it differs from distributed computing and grid computing; introduces cloud computing vendors, data storage, data management, programming models, and service types; pays particular attention to the types of security risks in cloud computing and methods of protection; and sets out the problems cloud computing still needs to solve.

Research on the design and application of a data service center for big data

Design and application of a data service center for big data. Big data management challenges traditional data management service platforms built on relational database management systems, so researching and developing data management and service centers that support the unified management of structured and unstructured data has become an urgent and important task. The paper summarizes and analyzes four key technologies of a big data service center and, using a configurable generalized table model among other techniques, designs a prototype data service center system based on the Hadoop platform.

Optimizing caching parameters for MySQL database performance

Performance optimization is the topic I am asked about most often in connection with MySQL, so I recently decided to write a series of articles on MySQL database performance optimization, which I hope will help junior and intermediate MySQL DBAs and anyone else interested in MySQL performance. Databases are I/O-intensive applications whose primary responsibility is data management and storage. Reading data that is already in memory takes on the order of microseconds, while a single I/O against an ordinary hard disk takes milliseconds; both ...
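The microsecond-versus-millisecond gap is exactly why cache sizing matters. As a minimal, hypothetical sketch (the counter values are made up; in practice they come from `SHOW GLOBAL STATUS`), here is how the InnoDB buffer pool hit ratio, a common first check when tuning MySQL caching parameters, can be computed:

```python
# Hypothetical counter values as reported by SHOW GLOBAL STATUS;
# on a real server you would query these from a live MySQL instance.
status = {
    "Innodb_buffer_pool_read_requests": 1_000_000,  # logical page reads (served from memory)
    "Innodb_buffer_pool_reads": 2_500,              # reads that had to go to disk
}

def buffer_pool_hit_ratio(status):
    """Fraction of page read requests served from the InnoDB buffer pool."""
    requests = status["Innodb_buffer_pool_read_requests"]
    disk_reads = status["Innodb_buffer_pool_reads"]
    return 1 - disk_reads / requests

print(f"hit ratio: {buffer_pool_hit_ratio(status):.4f}")  # here: 0.9975
```

A ratio well below ~0.99 on a read-heavy workload usually suggests the buffer pool (`innodb_buffer_pool_size`) is too small for the working set.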

DataCleaner 2.5.1 release: data quality analysis tool

DataCleaner is an open-source data quality analysis tool for data analysis, validation, transformation, and similar ETL tasks. The tool can help you manage and monitor the quality of your data to ensure that your data is valid. It can be used for master data management (MDM) methodologies, data warehousing projects, statistical research, extract-transform-load work, and more. DataCl ...
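The kinds of checks such a tool automates can be illustrated with a small sketch. This is not DataCleaner's API, just a hypothetical stand-in showing empty-value and pattern validation over toy records:

```python
import re

# Toy records standing in for a real data source (all values hypothetical).
rows = [
    {"name": "Alice", "email": "alice@example.com"},
    {"name": "",      "email": "not-an-email"},
    {"name": "Bob",   "email": "bob@example.com"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def profile(rows):
    """Count empty names and syntactically invalid e-mail addresses."""
    report = {"empty_name": 0, "bad_email": 0, "total": len(rows)}
    for row in rows:
        if not row["name"].strip():
            report["empty_name"] += 1
        if not EMAIL_RE.match(row["email"]):
            report["bad_email"] += 1
    return report

print(profile(rows))  # {'empty_name': 1, 'bad_email': 1, 'total': 3}
```

A real quality tool runs many such rules continuously and reports trends, but each rule reduces to a check of this shape.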

Application of MongoDB in a call center system

The application of MongoDB in a call center system. MongoDB is a scalable, high-performance, schema-free data management system based on document storage. Document-oriented storage lets MongoDB store loosely structured data; its weak consistency lets MongoDB deliver faster user access; and its high performance lets MongoDB better support the processing of large data volumes. The continuous development of call centers has placed new demands on data storage systems: large data volumes + loose data structures + high access ...
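The "loose structure" point can be illustrated without a MongoDB server: the sketch below uses plain Python dicts as a hypothetical stand-in for documents, with a `find()` helper mimicking the shape of a MongoDB query (a real deployment would use a driver such as pymongo):

```python
# Call records with a deliberately loose structure: fields vary per document,
# which is what document stores like MongoDB accommodate natively.
calls = [
    {"caller": "1001", "duration": 42},
    {"caller": "1002", "duration": 7, "transferred_to": "1003"},
    {"caller": "1001", "duration": 120, "notes": "follow-up needed"},
]

def find(collection, **criteria):
    """Minimal filter mimicking a MongoDB find({...}) query."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in criteria.items())]

print(find(calls, caller="1001"))  # both records for caller 1001
```

No schema migration is needed when a new field like `transferred_to` appears; queries simply ignore fields a document lacks.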

Parsing SQL Azure How to extend a data platform into the cloud

Today, Internet-based applications face many challenges. Users expect to be able to use any device to access data anytime, anywhere, yet the volume of data, the formats in which it is exchanged, and the scale of user access may change at any moment. Developers must quickly build and deploy new applications to meet these evolving needs. With traditional data management platforms, meeting such growth and change requires continuous investment in servers, operating systems, storage, and networking. Database services in the cloud, such as Microsoft's SQL Azure, provide a new way to deal with ...

Trend Micro CEO: separating hardware from encryption to achieve cloud security

Trend Micro has recently focused on the cloud computing security market and virtualization. CEO and co-founder Chen Yihua told reporters that many people now ask whether the public cloud is less secure than the private cloud, but in fact public cloud security can start with management, completely separating computing from the encryption of managed data to achieve both safety and efficiency. Regarding information security in the public cloud, Chen Yihua said that across the whole category of cloud security, what users worry about most is the security of data in the public cloud, and whether the public cloud can guarantee that the data only ...

Overview of Cloud Computing Technology

Overview of Cloud Computing Technology. This paper introduces the working principle of cloud computing, its differences from grid computing, its characteristics and application modes, and its key technologies, including virtualization, data storage, data management, programming models, and resource monitoring, and points out the main problems in applying cloud computing. Keywords: cloud computing; grid computing; virtualization; data storage; data management. [Download address]: http://bbs.chinacloud ...

DataCleaner 2.3 release: data quality analysis tool

DataCleaner is an open-source data quality analysis tool for data analysis, validation, transformation, and similar ETL tasks. The tool can help you manage and monitor the quality of your data to ensure that your data is useful. It can be used for master data management (MDM) methodologies, data warehousing projects, statistical research, extract-transform-load work, and more. DataCle ...

Pyrseas 0.4.0 release: relational database maintenance tool

Pyrseas is a tool that provides a framework and utilities to upgrade and maintain relational databases. Its dbtoyaml utility creates a YAML description of a PostgreSQL database's tables, and its yamltodb utility generates the SQL statements needed to modify the database to match an input YAML specification. It can also be used to develop data management tools that complement a flexible database ...
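The idea behind yamltodb, diffing a desired schema description against the actual database and emitting SQL to reconcile them, can be sketched as follows. This is an illustrative simplification, not Pyrseas code; the schema dicts, table names, and the blanket `text` column type are all hypothetical:

```python
# Hypothetical schema descriptions: `desired` plays the role of the parsed
# YAML specification, `actual` of the schema introspected from the database.
desired = {"users": ["id", "name", "email"], "orders": ["id", "user_id"]}
actual  = {"users": ["id", "name"]}

def diff_to_sql(desired, actual):
    """Emit rough SQL statements bringing `actual` in line with `desired`."""
    statements = []
    for table, columns in desired.items():
        if table not in actual:
            cols = ", ".join(f"{c} text" for c in columns)
            statements.append(f"CREATE TABLE {table} ({cols});")
        else:
            for col in columns:
                if col not in actual[table]:
                    statements.append(f"ALTER TABLE {table} ADD COLUMN {col} text;")
    return statements

for stmt in diff_to_sql(desired, actual):
    print(stmt)
```

The real tool must also handle types, constraints, indexes, drops, and renames, but the core operation is this declarative diff.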

Common Data Format 3.4.0 publishes scientific information management tool

The National Space Science Data Center's (NSSDC) Common Data Format (CDF) is a self-describing, platform-independent data format for storing and manipulating scalar and multidimensional data. It comes with a scientific data management software package (the CDF library) that allows programmers and application developers to manage and process scalar, vector, and multidimensional data arrays. Common Data Format 3.4.0: this version adds HTTP://WW ...
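"Self-describing" means the metadata travels with the data, so a reader needs no external schema to interpret the bytes. A rough, hypothetical illustration of that idea in miniature (this is not the actual CDF encoding, just the principle):

```python
import json
import struct

def pack_array(values, shape):
    """Serialize a float array together with a metadata header describing it,
    so a reader needs no external schema to interpret the bytes."""
    header = json.dumps({"shape": shape, "dtype": "float64"}).encode()
    return (struct.pack(">I", len(header)) + header
            + struct.pack(f">{len(values)}d", *values))

def unpack_array(blob):
    """Read the header first, then decode the data it describes."""
    (hlen,) = struct.unpack_from(">I", blob)
    meta = json.loads(blob[4:4 + hlen])
    count = 1
    for dim in meta["shape"]:
        count *= dim
    values = list(struct.unpack_from(f">{count}d", blob, 4 + hlen))
    return meta, values

meta, values = unpack_array(pack_array([1.0, 2.0, 3.0, 4.0], [2, 2]))
print(meta, values)
```

The real CDF library additionally handles named variables, attributes, compression, and platform-independent numeric encodings, but the header-plus-payload layout is the essence of self-description.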

Common Data Format 3.3.1.1 publishes a scientific data management package

The National Space Science Data Center's (NSSDC) Common Data Format (CDF) is a self-describing, platform-independent format for data storage and multidimensional data manipulation. A scientific data management package (called the "CDF library") allows programmers and application developers to manage and manipulate scalar, vector, and multidimensional data arrays. Common Data Format 3.3 ...

Secrets of Germany's Championship: How to manage players with big data

In the early hours of the day, the German team beat Argentina 1:0, lifting the World Cup trophy for the fourth time in its history. On June 21, Wetech published a report decrypting how the German team used big data to manage the squad. Today, let's revisit that story and explore the secrets of Germany's championship. A 21st Century Business Herald reporter, Huang, reported from Shanghai: in the World Cup's first round, the German team beat Ronaldo's Portugal 4:0; who would have thought that this was actually the power of big data? This June, SAP's Brazilian branch and the German Football Association, at Germany's World ...

Mining big data to create value in human resource management

In recent years, the term "big data" has been mentioned more and more, and exploiting the commercial value around "big data" has gradually become the industry's profit focus. Many businesses and individuals have begun studying how to turn "big data" into information assets that better support decision making, insight discovery, and process optimization. According to a 2013 IBM survey, 28% of companies had begun experimenting with big data, and 47% of enterprises had begun expanding data-related activities. So, for an enterprise's human resource managers, what does the big data age mean for human resources management? ADP ...

When should a dedicated CEO take care of the company?

Summary: Editor's note: The author is Mark Suster, a partner at the renowned investment firm GRP Partners. Readers who want more of Mark Suster's articles can click here. When should a dedicated CEO take care of the company? There are already too many people talking about ...

How to help enterprises achieve the goal of sustainable development of information?

With the progress of enterprise informatization, enterprises' demand for integration is becoming urgent. Traditional integration approaches make it difficult for enterprises to achieve integrated informatization, because they impose constraints on the enterprise's long-term development. UF UAP helps enterprises achieve the goal of sustainable informatization by building an integration framework suited to the enterprise's characteristics. Traditional application integration has the following disadvantages: first, there is no relatively independent organization responsible for application integration; integration work remains in the software implementation phase, and whichever software the company adopts, the corresponding vendor does some of the integration work; ...

Facebook solves the Achilles heel of Hadoop

The Hadoop wave is gradually sweeping across America's vertical industries, including finance, media, retail, energy, and pharmaceuticals. While building up the concept of big data, Hadoop also performs real-time analysis of massive data and finds trends in that analysis to improve enterprise profitability. As open-source data management software, Apache Hadoop is primarily used to analyze large amounts of structured and unstructured data in a distributed environment. Hadoop has been adopted by many popular companies, including Yahoo, Facebook, LinkedIn, and eBay.
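Hadoop's distributed analysis model is usually introduced through the word-count example; below is a single-process sketch of the map and reduce phases, with ordinary Python functions standing in for Hadoop tasks and hypothetical input lines:

```python
from itertools import groupby
from operator import itemgetter

# The canonical word-count example: a map phase emits (word, 1) pairs,
# and a reduce phase sums the counts per word. In Hadoop these phases
# run distributed across many machines; here they run in one process.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    counts = {}
    # Sorting groups equal keys together, mimicking Hadoop's shuffle step.
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        counts[word] = sum(n for _, n in group)
    return counts

lines = ["big data big analysis", "data management"]
print(reduce_phase(map_phase(lines)))
# {'analysis': 1, 'big': 2, 'data': 2, 'management': 1}
```

The sort between the phases plays the role of Hadoop's shuffle, which routes all pairs with the same key to the same reducer.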

Shanghai launches big data talent training program; first engineering master's students enroll in June

On June 12, the reporter learned from the Shanghai Science and Technology Commission that the "Data Science and Big Data Talent Training Program" has been officially launched and will cultivate and recruit thousands of high-end data professionals over the next three years. The first batch of big data engineering master's students will be recruited in June this year and admitted in September. The program initially launched three flagship projects, including a master's in data engineering and a data scientist training camp, with six further training projects to follow, such as young data scientists, a PhD in data science, a master's in data science, and a second degree in data science. "Big ...


Contact Us

The content on this page is sourced from the Internet and does not represent Alibaba Cloud's opinion; products and services mentioned on this page have no relationship with Alibaba Cloud. If the content of the page confuses you, please write us an email, and we will handle the problem within 5 days of receiving it.

If you find any instances of plagiarism from the community, please send an email to: info-contact@alibabacloud.com and provide relevant evidence. A staff member will contact you within 5 working days.
