
1.) What is the significance of a conceptual model?


Ans: A conceptual model provides a global view of the entire database. It is a representation of data as viewed by the entire organization. It forms the basis for identifying and describing the main data objects at a high level, avoiding implementation details. Its concepts are used to convey semantics during natural-language communication among stakeholders.

2.) What do you mean by a weak entity? Explain with an example.


Ans: A weak entity set is one whose members owe their existence to some entity in a strong entity set; its entities do not have an independent existence. Each weak entity is associated with an entity of the owner entity set through a special identifying relationship. A weak entity set does not have a key attribute of its own.

Example: An order cannot exist without a product and a person to place it, so an order can be modeled as a weak entity, with the products ordered as a multivalued attribute of the order.
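The idea above can be sketched in code. In this minimal Python sketch (the class and attribute names are illustrative, not from the original), an Order is identified only in combination with its owner Customer: its order number is a partial key that is unique only within one customer.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Customer:
    customer_id: int     # key of the owner (strong) entity
    name: str

@dataclass(frozen=True)
class Order:
    owner: Customer      # identifying relationship to the owner entity
    order_no: int        # partial key: unique only within one customer

    @property
    def key(self):
        # a weak entity's full key = owner's key + its partial key
        return (self.owner.customer_id, self.order_no)

alice = Customer(1, "Alice")
order = Order(alice, 1)
print(order.key)  # (1, 1)
```

Two different customers can each have an order number 1; only the combined key distinguishes the orders, which is exactly what makes the entity weak.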

3.) Explain specialization and generalization in an ER diagram.


Ans: Specialization: The process of maximizing the differences between members of an entity set by identifying their distinguishing characteristics. Specialization is the result of taking subsets of a higher-level entity set to form lower-level entity sets. Specialization is a top-down process, whereas generalization is a bottom-up process.

Generalization: The process of minimizing the differences between entities by identifying their common characteristics. Generalization and specialization are important relationships that exist between a higher-level entity set and one or more lower-level entity sets. Generalization is the result of taking the union of two or more lower-level entity sets to produce a higher-level entity set. In generalization, each higher-level entity must also be a lower-level entity; in specialization, some higher-level entities may not belong to any lower-level entity set.

4.) Differentiate between static and dynamic hash functions.

Ans: Static hashing: In static hashing, the hash function maps search-key values to a fixed set of locations; the number of primary pages is fixed. When a bucket is full, an overflow bucket is needed to store any additional records that hash to it. When searching for a record, the primary bucket is accessed first, then the overflow buckets. If many keys hash to the same bucket, locating a record may require accessing multiple pages on disk, which greatly degrades performance.

Dynamic hashing: In dynamic hashing, the hash table can grow to handle more items, and the associated hash function changes as the table grows. This solves the problem of lengthy searches through overflow buckets: the size of the directory grows with the number of collisions, accommodating new records and avoiding long overflow page chains. Extendible and linear hashing are two dynamic hashing techniques.
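The overflow-chain problem of static hashing can be sketched in a few lines of Python (a toy in-memory model, not disk-based; the class and parameter names are illustrative). The bucket count is fixed at construction, so colliding keys accumulate in overflow pages that a search must scan one by one.

```python
class StaticHashTable:
    """Static hashing: a fixed number of buckets; each bucket is a chain
    of fixed-size pages (page 0 is the primary page, the rest overflow)."""

    def __init__(self, n_buckets=4, page_size=2):
        self.n = n_buckets
        self.page_size = page_size
        self.buckets = [[[]] for _ in range(n_buckets)]

    def insert(self, key):
        pages = self.buckets[hash(key) % self.n]
        if len(pages[-1]) == self.page_size:
            pages.append([])        # last page full: add an overflow page
        pages[-1].append(key)

    def search(self, key):
        # worst case: scan the primary page AND every overflow page
        return any(key in page for page in self.buckets[hash(key) % self.n])
```

With four buckets, inserting 0, 4, 8, 12 sends everything to bucket 0, so its chain grows to two pages; a dynamic scheme would instead grow the table to keep chains short.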

5.) Explain the extendible hashing technique in detail.


Ans: Extendible hashing is a mechanism for altering the size of the hash table to accommodate new entries when buckets overflow. A common strategy in internal hashing is to double the hash table and rehash every entry; on disk this is too slow, because rewriting all pages is expensive. Therefore, instead of doubling the whole hash table, we keep a directory of pointers to buckets and double the number of addressable buckets by doubling the directory, splitting only the bucket that overflows. Since the directory is much smaller than the file, doubling it is much cheaper, and only one page of keys and pointers is split. This eliminates the need for overflow pages. Extendible hashing is thus an access technique that guarantees no more than two page faults to locate the data associated with a given unique identifier, or key.
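The mechanism above can be sketched as a small in-memory Python model (illustrative only; a real implementation stores buckets as disk pages). The directory holds 2**global_depth pointers indexed by the low-order bits of the hash; on overflow, only the full bucket is split, and the directory is doubled only when that bucket's local depth already equals the global depth.

```python
class Bucket:
    def __init__(self, local_depth):
        self.local_depth = local_depth
        self.keys = []

class ExtendibleHash:
    """Sketch of extendible hashing with least-significant-bit indexing."""
    BUCKET_SIZE = 2

    def __init__(self):
        self.global_depth = 1
        self.directory = [Bucket(1), Bucket(1)]

    def _index(self, key):
        # use the low-order global_depth bits of the hash as the index
        return hash(key) & ((1 << self.global_depth) - 1)

    def search(self, key):
        # at most two lookups: the directory slot, then one bucket
        return key in self.directory[self._index(key)].keys

    def insert(self, key):
        b = self.directory[self._index(key)]
        if len(b.keys) < self.BUCKET_SIZE:
            b.keys.append(key)
            return
        if b.local_depth == self.global_depth:
            # directory doubles: every existing pointer is duplicated
            self.directory += self.directory
            self.global_depth += 1
        # split ONLY the overflowing bucket
        b.local_depth += 1
        new_b = Bucket(b.local_depth)
        mask = 1 << (b.local_depth - 1)
        for i, ptr in enumerate(self.directory):
            if ptr is b and (i & mask):
                self.directory[i] = new_b
        old_keys, b.keys = b.keys + [key], []
        for k in old_keys:
            self.insert(k)   # redistribute; may split again in rare cases
```

Note how doubling the directory copies only pointers, never data pages, which is why it is so much cheaper than rehashing the whole file.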
