User:KeyStroke/Subpage 1

From Wikipedia, the free encyclopedia

Career Summary

Data Analyst with extensive experience constructing logical and physical data models, and databases, for On-Line Transaction Processing (OLTP) systems and Online Analytical Processing (OLAP) systems (data marts and data warehouses), most recently implemented in Oracle. Experience also includes sole responsibility for assuring data quality and data integrity across more than 90 distributed Oracle instances, and coding an in-house data movement (ETL) capability. Hands-on experience also includes more than 6 years coding PL/SQL triggers, procedures, packages, and functions in Oracle, and optimizing SQL statements.

Experience

Transamerica Insurance, Kansas City, MO Data Analyst 8/2003-Present

Achieved acceptance of the use of client-server technologies to convert life insurance policy administration data from an existing IMS-based mainframe system to the latest available version of the corporate policy administration system. Technologies used include PowerCenter (by Informatica), Oracle 9i, stored PL/SQL procedures and functions, and SQL*Loader.

Instituted data quality profiling against data from the existing policy administration system to bring greater visibility to data integrity problems and incompatibilities between the source and target policy administration systems. Technologies utilized: dfPower Studio from DataFlux and ad-hoc SQL statements in Oracle.

Provided project management functions for the Data Administration group, including managing Data Management sub-tasks within the project plan, establishing commitments to the project schedule on behalf of Data Management, and monitoring schedule progress for other Data Management members, including other Data Analysts, DBAs, and Informatica PowerCenter consultants.

Evaluated responses to Requests for Information on a centralized metadata repository. Compiled the evaluations into a recommendation on acquisition of a metadata repository.

Provided expert guidance on SQL optimization and PL/SQL stored procedures, triggers, and functions to the development group implementing a new reinsurance administration system. Designed and implemented a data mart for the reinsurance administration system, which formed the basis for displaying data via the web to partner insurance companies.

Sprint PCS, Overland Park, KS Data Analyst III 10/2001-3/2003

Perform Project Lead functions. Provide initial estimates and estimates-to-complete.

Participate in multiple concurrent, multi-week group data modeling efforts (for various development projects), producing both logical and physical data models that were subsequently implemented in an Oracle database. Analyze the company's business functions to identify the data entities, relationships, and attributes required for automated systems. Interview sources to determine data requirements. Pay particular attention to building an integrated, cohesive, single data model that resolves the data integrity issues among three previously implemented applications.

Coach and mentor junior Data Analysts in data modeling techniques.

Record the logical data model in the PowerDesigner (v9.5) tool. Generate the physical data model from the logical model. Prepare and publish Data Requirements Specification documents. Generate DDL from PowerDesigner and prepare the DDL for turnover to the DBA.

Design, code, and unit-test PL/SQL functions, triggers, and stored procedures to maintain data integrity within and between systems (ETL). Coach and mentor developers in optimization of SQL statements. Investigate data quality using data mining methods and ad-hoc SQL.

Augment existing data mart structures with new data requirements. Provide consultation on the most effective way to achieve data movement (ETL) between the OLTP system and the data mart.

Reverse-engineer physical data models from Oracle databases using PowerDesigner.

Open Systems Group, Dallas, TX Sr. IT Specialist 3/2000-9/2001

Client: Sprint Long Distance (May 2001 to Sep 2001)

Central Data Management point-of-contact for the development of user-friendly, front-end, object-oriented software that allows translations and routing specifications to be populated into more than 150 telecommunications switches. This was a joint effort between Sprint PCS and Sprint Long Distance. Provide project planning and resource coordination for Data Management personnel on the project. Provide estimates and manage action items for Data Management involvement on the project. Facilitate group data modeling sessions. Conduct technical quality assurance reviews. Convert the logical data model to a physical data model, making the necessary compromises to optimize for implementation in Oracle. Generate DDL. Collaborate with DBAs from Sprint PCS and Sprint LD to implement the DDL in the Oracle development environment.

Client: Bell Helicopter/Textron (May 2000 to Dec 2000)

Central contributor to the effort at Bell Helicopter to integrate existing legacy data with commercially available, industry-standard software, using a normalized enterprise database implemented on a client-server platform in Oracle. This effort allowed Bell Helicopter to provide greater visibility into its existing data resources while leveraging those resources to reduce the ramp-up time for implementing commercial off-the-shelf (COTS) software. It also had the direct benefit of reducing Bell Helicopter's dependence on out-of-date legacy systems and databases while increasing opportunities for data reuse and improving data quality. This effort laid the foundation for future efforts at Bell Helicopter, including e-Commerce and data warehousing.

Koch Industries Inc, Wichita, KS Sr. Data Analyst/Replication Administrator 2/1996-3/2000

Create a small, quickly developed data mart in Oracle housing 4 fact tables accessed via 13 dimension tables, with over one million rows in the largest fact table. (CASE tool used: PowerDesigner.)

Provide recommendations on data modeling techniques, and expert advice on SQL and PL/SQL tuning, to the Information Warehouse team.

Provide the data modeling, database design, production rollout planning, and implementation for data structure changes to an Oracle database in support of the OLTP system. This was undertaken in support of a major version enhancement to the ERP system tailored for Koch's use. (CASE tool used: ERwin.)

Manage and maintain the daily replication process, which maintains data integrity among 77 geographically dispersed, yet structurally identical, Oracle databases throughout the 48 contiguous US states. Using knowledge of SQL and PL/SQL coding techniques, redesign the procedural symmetric replication in response to changing data distribution needs in support of version 2.1 of Point.Man.

Provide SQL coding expertise to other developers on the OLTP project. Create PL/SQL triggers, stored procedures, and packages to enhance and support the ERP system. Determine the set of Oracle database instances needed to support the development and production needs of Koch Materials Co. Respond to various ad-hoc requests for information from the ERP system by constructing SQL*Plus scripts to create reports.

Sprint, Irving, TX Data Analyst IV 7/1989-2/1996

Along with the fundamental functions performed by a Data Analyst, I distinguished myself by also performing these activities:

  • Perform Project Lead and Business Requirements Analyst functions.
  • Initiate and participate in an effort to create a company-wide standard data modeling technique.
  • Facilitate group data modeling efforts.

Tools used: ENFIN, Bachman, ADW, IEW, IEF, IEX, and PROKIT

Boeing, Wichita, KS Data Management Technical Lead 10/1984-6/1989

Along with the fundamental functions performed by a Data Analyst, I distinguished myself by also performing these activities:

  • Facilitate data modeling sessions in a multi-week group effort.
  • Manage data dictionary system and dictionary contents.
  • Prepare and publish Data Requirements Specification Documents.
  • Provide consulting to DBAs on the most appropriate compromises to data structures during database design.
  • Conduct training sessions in the IDEF-1 data modeling technique.
  • Participate in efforts to define and expand the data modeling technique used by the Air Force.

Tools used: Excelerator