Intelligent database
Until the 1980s, databases were viewed as computer systems that stored record-oriented, business-type data such as manufacturing inventories, bank records, and sales transactions. A database system was not expected to merge numeric data with text, images, or multimedia information, nor was it expected to automatically notice patterns in the data it stored. In the late 1980s, the concept of an intelligent database was put forward: a system that manages information (rather than data) in a way that appears natural to users and that goes beyond simple record keeping.
The term intelligent database was introduced in 1989 in the book Intelligent Databases by Kamran Parsaye, Mark Chignell, Setrag Khoshafian and Harry Wong. This concept postulated three levels of intelligence for such systems: 1. high-level tools, 2. the user interface, and 3. the database engine. The high-level tools manage data quality and automatically discover relevant patterns in the data through a process called data mining; this layer often relies on artificial intelligence techniques. The user interface uses hypermedia in a form that uniformly manages text, images, and numeric data. The intelligent database engine supports the other two layers, often merging relational database techniques with object orientation.
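To make the three-level picture concrete, the following is a minimal, hypothetical Python sketch (all class and function names are invented for illustration and are not taken from the book or any real product): a simple storage engine at the bottom, a toy pattern-discovery routine standing in for the data-mining tools at the top, and a thin interface layer that reports stored data and discovered patterns to the user.

```python
# Hypothetical sketch of the three-level "intelligent database" idea.
from collections import Counter
from itertools import combinations


class DatabaseEngine:
    """Bottom level: stores records as objects (here, plain dicts) keyed by id."""

    def __init__(self):
        self._records = {}

    def put(self, record_id, record):
        self._records[record_id] = record

    def all_records(self):
        return list(self._records.values())


class HighLevelTools:
    """Top level: discovers simple patterns in the stored data (toy data mining)."""

    def __init__(self, engine):
        self.engine = engine

    def frequent_pairs(self, min_count=2):
        # Count how often pairs of items occur together across transactions.
        counts = Counter()
        for record in self.engine.all_records():
            for pair in combinations(sorted(record.get("items", [])), 2):
                counts[pair] += 1
        return [(pair, n) for pair, n in counts.items() if n >= min_count]


class UserInterface:
    """Middle level: presents data and discovered patterns uniformly to the user."""

    def __init__(self, engine, tools):
        self.engine = engine
        self.tools = tools

    def report(self):
        lines = [f"{len(self.engine.all_records())} transactions stored"]
        for pair, n in self.tools.frequent_pairs():
            lines.append(f"items {pair[0]} and {pair[1]} occur together {n} times")
        return "\n".join(lines)


if __name__ == "__main__":
    engine = DatabaseEngine()
    engine.put(1, {"items": ["bread", "butter", "jam"]})
    engine.put(2, {"items": ["bread", "butter"]})
    engine.put(3, {"items": ["bread", "milk"]})

    ui = UserInterface(engine, HighLevelTools(engine))
    print(ui.report())
    # Prints, among other lines: "items bread and butter occur together 2 times"
```

In a real intelligent database the bottom layer would be a full object-relational engine and the top layer would apply far richer mining techniques, but the division of responsibility between the three levels follows the same pattern as this sketch.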
In the twenty-first century, intelligent databases have become widespread: for example, hospital databases can call up patient histories consisting of charts, text, and X-ray images with just a few mouse clicks, and many corporate databases include decision-support tools based on sales pattern analysis.
External links
- The book Intelligent Databases.