A high-quality database can improve your business dramatically by reducing the time required for data entry and by reducing mistakes, improving data quality. It also organises and presents data more effectively, making essential business information more accessible to decision makers and users.

With your business requirements in mind, we audit your databases and then provide expert advice on how to improve their efficiency, saving you time and money. With more than 30 years' experience in database design and in applying modern, integrated and dynamic database solutions, we deliver databases of high quality.

The main database service lines are:
– Investigating an existing customer database and upgrading it to resolve any combination of issues detected in the data model, data quality or database performance.
– Building a new high-quality database from scratch.
– Designing, fixing and extending database data feeds, including ETL processes, high-volume data loads/reloads, application data interfaces, and data migration and consolidation.

To develop and support your database data model, we apply our skills and experience in the data modelling tools of your choice, whether that choice follows an enterprise standard or our recommendations.

In database and ETL design we use advanced methods of relational/dimensional modelling, with normalisation/de-normalisation and "star"/"snowflake" schema design techniques based on the work of Codd, Kimball, Inmon and other authorities, and on the best practical solutions.
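
As a minimal illustration of the star schema technique, the sketch below defines one fact table referencing two dimension tables (all names are hypothetical, not taken from any client design):

    -- A minimal star schema: a fact table and two dimensions.
    CREATE TABLE dim_date (
      date_key     NUMBER PRIMARY KEY,
      calendar_dt  DATE NOT NULL,
      fiscal_year  NUMBER NOT NULL
    );

    CREATE TABLE dim_product (
      product_key  NUMBER PRIMARY KEY,
      product_name VARCHAR2(100) NOT NULL
    );

    CREATE TABLE fact_sales (
      date_key     NUMBER NOT NULL REFERENCES dim_date (date_key),
      product_key  NUMBER NOT NULL REFERENCES dim_product (product_key),
      quantity     NUMBER NOT NULL,
      amount       NUMBER(12,2) NOT NULL
    );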

Checking logical database design and resolving issues

– Re-engineering the database design into a database design tool (if one is not already in use)
– Verifying the entities/relationships of the existing logical model
– Eliminating logical data redundancy
– Normalising/de-normalising/dimensioning the data (see the sketch after this list)
– Checking reference data for further consolidation and unification
– Implementing a naming convention
– Implementing the new, updated logical data model
– Verifying the new model against existing and potential data queries
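
As a sketch of normalisation, the hypothetical tables below replace order rows that repeat customer details with a single customer table referenced from the orders:

    -- Before: customer name and e-mail repeated on every order row.
    -- After: one row per customer, referenced from the orders.
    CREATE TABLE customer (
      customer_id   NUMBER PRIMARY KEY,
      customer_name VARCHAR2(100) NOT NULL,
      email         VARCHAR2(100)
    );

    CREATE TABLE customer_order (
      order_id    NUMBER PRIMARY KEY,
      customer_id NUMBER NOT NULL REFERENCES customer (customer_id),
      order_dt    DATE NOT NULL
    );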

Checking data quality and fixing data

– Data dictionary creation/extension
– Verifying data values against templates, and data cleansing
– Creating/extending database constraints to prevent data errors (see the sketch after this list)
– Testing data consistency in the updated database and repairing any violations found
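
The sketch below shows the kind of constraints we add so that bad data is rejected at the source; table, column and constraint names are illustrative:

    -- Reject impossible values and orphaned references.
    ALTER TABLE employee
      ADD CONSTRAINT employee_salary_chk CHECK (salary > 0);

    ALTER TABLE employee
      ADD CONSTRAINT employee_dept_fk
      FOREIGN KEY (dept_id) REFERENCES department (dept_id);

    -- Find existing rows that would violate the rule before enabling it.
    SELECT employee_id, salary
    FROM   employee
    WHERE  salary <= 0;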

Checking application database performance

– Evaluating the highest database performance potentially achievable
– Finding performance bottlenecks (see the sketch after this list)
– Recommendations for instance tuning
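
A typical first step in locating a bottleneck is to inspect the optimiser's plan for a slow statement; a minimal Oracle sketch, with illustrative query and table names:

    -- Capture and display the execution plan for a suspect query.
    EXPLAIN PLAN FOR
    SELECT o.order_id, c.customer_name
    FROM   customer_order o
    JOIN   customer c ON c.customer_id = o.customer_id
    WHERE  o.order_dt >= DATE '2024-01-01';

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);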

Refining the physical database design

– Accelerating data handling (indexing/de-indexing, caching, data partitioning; see the sketch after this list)
– Re-designing data storage (schema/table splitting, partitioning, compression)
– Sensitive data protection solutions
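
As an example of partitioning for both speed and manageability, the hypothetical table below is range-partitioned by date with a matching local index, so old partitions can be compressed or dropped independently:

    CREATE TABLE sales_history (
      sale_dt  DATE NOT NULL,
      amount   NUMBER(12,2) NOT NULL
    )
    PARTITION BY RANGE (sale_dt) (
      PARTITION p2023 VALUES LESS THAN (DATE '2024-01-01'),
      PARTITION p2024 VALUES LESS THAN (DATE '2025-01-01')
    );

    CREATE INDEX sales_history_dt_ix ON sales_history (sale_dt) LOCAL;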

Checking and optimising database server procedure design and performance

– Analysing working server procedures and documenting them
– Analysing existing database data controls, adjusting and developing them where required
– Moving SQL scripting logic into database procedures (see the sketch after this list)
– Rewriting/creating database procedures to refine data handling and quality
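
Moving logic from ad-hoc SQL scripts into a stored procedure keeps the rule in one place; a minimal PL/SQL sketch, with a hypothetical business rule and names:

    CREATE OR REPLACE PROCEDURE apply_price_uplift (
      p_product_id IN NUMBER,
      p_percent    IN NUMBER
    ) AS
    BEGIN
      -- The pricing rule lives here, not in scattered scripts.
      UPDATE product_price
      SET    unit_price = unit_price * (1 + p_percent / 100)
      WHERE  product_id = p_product_id;
    END apply_price_uplift;
    /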

Logical database design

– Analysing business data requirements
– Data dictionary creation
– Data flow design
– Reference data consolidation and unification
– Naming convention solution
– Centralised data control design (referential integrity and check constraints)
– Logical database design development (OLTP, OLAP, IMS) with data modelling tools
– Verifying the new logical data model against existing and potential data queries
– Documenting the logical data model (see the sketch after this list)
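
One way to keep part of the data dictionary inside the database itself is to store column comments in the catalogue; a small Oracle sketch, assuming the hypothetical customer table above:

    COMMENT ON COLUMN customer.email IS 'Primary contact e-mail address';

    -- The comments are then queryable alongside the model.
    SELECT table_name, column_name, comments
    FROM   user_col_comments
    WHERE  table_name = 'CUSTOMER';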

Physical database design

– Effective data access/modification design (indexing, caching, data partitioning)
– Robust, fast and safe data storing solutions (schema/table splitting, partitioning, compressing)
– Sensitive data protection design

Implementing the new data model in the database

– Generating and documenting database deployment scripts (see the sketch after this list)
– Loading sample data and verifying database quality
– Testing and tuning the performance of the new database
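
Where the database already exists, deployable DDL can be extracted directly from the Oracle catalogue; a small sketch, assuming the hypothetical CUSTOMER table:

    -- In SQL*Plus, widen the output buffer for long DDL first.
    SET LONG 100000
    SELECT DBMS_METADATA.GET_DDL('TABLE', 'CUSTOMER') FROM dual;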

Database server procedure design and development

– Implementation of business logic and data analysis through server procedures
– Server procedure architecture design
– PL/SQL code development, deployment and tuning

Legacy systems analysis for data migration/ETL design purposes

– Investigating legacy system data quality and providing solutions for the issues found
– Detecting hidden, undocumented legacy data dependencies (see the sketch after this list)
– Re-engineering and documenting the legacy data model
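
Declared dependencies can be read straight from the Oracle catalogue; anything enforced only in application code will not appear there and needs data profiling instead. A minimal sketch:

    -- List parent/child table pairs linked by foreign keys.
    SELECT a.table_name      AS child_table,
           a.constraint_name,
           b.table_name      AS parent_table
    FROM   user_constraints a
    JOIN   user_constraints b ON b.constraint_name = a.r_constraint_name
    WHERE  a.constraint_type = 'R';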

Design and implementation of high-volume data load solutions

– Designing and implementing data partitioning
– Measuring and tuning the load
– Custom solutions for special cases
– Choosing the best tools for the specific high-volume load (see the sketch after this list)
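
One common building block is a direct-path insert, which writes above the high-water mark and bypasses the buffer cache for bulk loads; a minimal Oracle sketch, with a hypothetical staging table:

    -- Direct-path load from staging into the target table.
    INSERT /*+ APPEND */ INTO sales_history
    SELECT sale_dt, amount
    FROM   staging_sales;

    COMMIT;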

New multi-level ETL design

– Business requirement analysis and ETL strategic solutions
– Stage, foundation, and target data warehouse level design and implementation
– Overnight and near-real-time change data capture solutions
– Integrated multi-source ETL development (see the sketch after this list)
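
At the target level, captured changes are often applied as an upsert; a minimal MERGE sketch, with hypothetical staging and warehouse tables:

    MERGE INTO dw_customer t
    USING stg_customer_delta s
    ON    (t.customer_id = s.customer_id)
    WHEN MATCHED THEN UPDATE SET
      t.customer_name = s.customer_name,
      t.email         = s.email
    WHEN NOT MATCHED THEN INSERT (customer_id, customer_name, email)
      VALUES (s.customer_id, s.customer_name, s.email);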

Existing ETL re-design and adjustment

– Analysing the existing ETL and reporting the issues found
– Improving the existing logical/physical data models
– Detecting and fixing performance bottlenecks
– Seamlessly adding new data sources to the ETL
– Documenting previously undocumented ETL

Data migration design and development

– Data migration strategy design and presentation
– Data migration using Oracle tools like ODI and OWB
– Bespoke data migration using a metadata-driven approach (see the sketch after this list)
– Data migration performance tuning
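
In a metadata-driven approach, the copy statements are generated from a mapping table rather than hand-coded; a sketch, where migration_map is a hypothetical table holding source/target names and column lists:

    -- Generate one INSERT ... SELECT per mapped table.
    SELECT 'INSERT INTO ' || target_table ||
           ' SELECT ' || column_list ||
           ' FROM '   || source_table || ';' AS migration_sql
    FROM   migration_map
    ORDER  BY load_order;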