Vacancy expired!
- Understanding of logical and physical data models.
- Understanding of normalization and de-normalization concepts.
- Ability to write efficient SQL & exposure to query tuning.
- Should have good working knowledge of UNIX shell scripting and Python.
- Should have good understanding of data archive/restore policies.
- Should be good at automating various processes.
- Should be good at user, object, and rights management.
- Some experience in an environment where ETL/BI tools such as Informatica, DataStage, Business Objects, SAS, MicroStrategy, Tableau, and Power BI played an important role.
- Experience with Teradata in a cloud environment (public or private) is a plus.
- System performance monitoring using Teradata Viewpoint.
- Exposure to Teradata tools such as Teradata Administrator, SQL Assistant, TSET, etc.
- User, object, and security management as per Teradata best practices.
- Hands-on experience using DSA (Data Stream Architecture).
- Exposure to other backup and restore processes using Veritas NetBackup (TARA), NetVault, Tivoli, etc.
- Exposure to managing Development/QA boxes.
- Experience as a Development DBA.
- Well conversant with ticketing systems, production change requests, and Teradata incident management.
- Exposure to MVC (multi-value compression), capacity planning, and performance management.
- Hands-on experience in statistics management.
- Exposure to troubleshooting FastLoad, MultiLoad, FastExport, BTEQ, and TPT, with strong error-handling skills.
- Exposure to TASM/TDWM.
- Exposure to DWH environments (knowledge of ETL/DI/BI reporting).
- Exposure to Teradata utilities.
- Coordinate software/hardware upgrade activities.
- Ensure Teradata policies, procedures, and customer SLAs are honored.
- Exposure to PDCR Reporting and Analysis.
- Strong debugging skills.