Data Warehousing: 8 years of solid experience in end-to-end implementation of data warehousing projects, including business requirements gathering, analysis, system study, preparation of functional and technical specifications, design (logical and physical models), coding, testing, code migration, implementation, system maintenance, support, and documentation.
Created a Data Factory job to pull JSON messages from TSF/Event Hub into SQL Data Warehouse.
ETL Tools: DataStage 8.1/8.7/11.5, Informatica 7.1, data integration, data ingestion.
Databases: Oracle 9i/10g/11g, DB2 UDB 8.1, Teradata V2R15, Hadoop and Impala.
Cloud Technologies: Microsoft Azure Data Lake/Data Factory, AWS, Snowflake, SnapLogic.
Programming Languages: SQL, Java 8.0, Python, Scala, Hive, Spark, Sqoop, XML, JSON.
Operating Systems: Unix, Linux, AIX, Sun Solaris, Windows NT, Windows Server 2008 R2.
Master in Computer Applications (MCA), Periyar University, Tamil Nadu, India, 2002.
Sr AWS Data Engineer/Sr ETL Developer, May 2012 to date.
The Snowflake Information Schema is kept in UPPER case.
Worked as Technical Lead and Support Lead, resolving complex technical issues with onsite and offshore teams and joining off-shift and weekend support meetings to meet deliverables without any slippage.
The data is ready to use for Analytics, ML and AI right away. There are additional requirements if using Avro format; for more details, see Snowflake Connector for Kafka. Replicated, prepared data that you can use for your Analytics, ML, AI or other applications.
Environment: IBM Information Server 8.5/8.0.1 (DataStage and QualityStage, FastTrack, Business Glossary, Information Analyzer), IBM DB2 9.1/9.7, Oracle 10g/11g, OBIEE 11g, SAP BusinessObjects XI R3, ERwin 4.1.4, AIX 5.3, UC4 Scheduling, Windows XP.
ETL Lead Data Warehouse Developer, January 2007 to January 2010.
Expertise in Snowflake data modeling, ELT using Snowflake SQL, implementing stored procedures, and standard DWH and ETL concepts; extensive experience in data proofing, data modelling, data quality, data standardization, and data stewardship.
704-***-****(Cell) / [email protected]
Developed parallel jobs using stages that included Join, Transformer, Sort, Merge, Filter, Lookup and Copy.
If you chose the phased migration approach in Step 2, repeat Steps 3-8 for each phase of your migration plan before moving on to Step 9.
Maintained change requests to avoid manual intervention and implemented automation processes without scope or schedule changes.
Prepared project timeline estimates and led the team to follow SDLC best practices such as continuous integration, automated unit testing and regression testing, with a focus on end-to-end delivery quality.
Nigel is a senior software and data engineer working across Cloud, Linux, AWS, GCP, Snowflake, Hadoop, and almost all computer and database platforms.
Continuous Integration and Continuous Delivery.
Maybe you are moving from an appliance-based data warehouse, or you have a data lake that makes it difficult to retrieve and analyze the data. The following is my suggested approach for Snowflake adoption with a primary focus … Now that the schema and data portions of your migration are completed, you have one final migration type, and this is the workflows that operated in your previous environment (think SQL … BryteFlow Ingest will automatically resume from where it left off, saving you hours of precious time.
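To make the upper-case point concrete, here is a minimal sketch (MY_DB and the table names are hypothetical): Snowflake stores unquoted identifiers in upper case, so string literals compared against Information Schema columns must also be upper case.

    -- Minimal sketch, assuming a hypothetical database MY_DB.
    -- Values returned by INFORMATION_SCHEMA views are upper case,
    -- so the literals in the WHERE clause are written in upper case to match.
    SELECT table_name, row_count, bytes
    FROM   my_db.information_schema.tables
    WHERE  table_schema = 'PUBLIC'
    AND    table_type   = 'BASE TABLE';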
Environment: Informatica PowerCenter 7.1, Oracle 8.0/7.x, SQL*Plus, SecureCRT 4.1, WinSCP, Rapid SQL 7.1.0, PL/SQL, Solaris 8.0, Windows NT 4.0.
GRM is a leading provider of information management solutions, offering data protection, imaging, certified destruction, film storage, hard copy, and more.
Bryan Valentini, Engineering Manager at Kargo, shares how the fast-growing startup, named to Business Insider's "Hottest Pre-IPO Ad-Tech Startups" in 2016, uncovers key business insights with Snowflake.
Used reliable hardware infrastructure that was scalable and powerful enough to accommodate the information needs of a rapidly growing business.
Responsible for migration of key systems from on-premises hosting to Azure Cloud Services.
Extracted data from variable-format sequential files, mainframes and Teradata using various stages in DataStage Designer.
Created an end-to-end flow to process the TSF data using Stream Analytics, Event Hubs and Topics to load a SQL DB. Data ranged from flat-file extracts to direct querying of databases.
Data Migration/ETL Developer, 05/2016 to 11/2016, GRM – Arlington, TX.
Teradata has been around for over 30 years and offers petabyte-scale processing, high scalability and customizability.
In Informix you can create a stored procedure that returns multiple rows using the RETURN WITH RESUME statement.
Of course, since Snowflake is truly a cloud/SaaS offering, you can auto-suspend and auto-resume warehouses.
Our migration timeline and process framework guided each team so they knew exactly when to join in and transition their data sources from SQL Server to Snowflake.
Delivered operational and production fixes as part of the EDW Nightly Batch Cycle with high productivity.
Objective: Over 8 years of experience in Information Technology with a strong background in analyzing, designing, developing, testing and implementing data warehouse solutions in domains such as Banking, Insurance, Health Care, Telecom and Wireless.
After the initial full ingest, BryteFlow Ingest captures incremental changes so your data at the destination is always updated while your data migration takes place.
A database belongs to exactly one Snowflake account and contains schemas, while a schema belongs to exactly one database and contains database objects such as tables and views. This also involves significant costs.
Used the Remedy tool to track tickets and projects based on the priority given by the client team.
Provided KPI reports that were used for allocation of resources and measuring of targets.
It is also worth noting that we will be demonstrating the data migration steps in Snowflake manually for the rest of this series.
SQL/SSIS/SSRS/Power BI Developer, University Hospitals | Cleveland, OH.
About BryteFlow ControlRoom: an operational dashboard that monitors all instances of BryteFlow Ingest and BryteFlow Blend, displaying the statuses of the various replication and transform instances.
Used DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating and loading data into the warehouse.
Amazon Web Services and Microsoft Azure Cloud Services, Azure DevOps / Visual Studio Team Services (VSTS), Automated Deployments and Release Management.
As of June 2019, the partner and non-partner accounts supported by Snowflake are as below.
Note that when copying data from files in a table stage, the FROM clause can be omitted because Snowflake automatically checks for files in the table stage.
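A short sketch of that table-stage behavior, assuming a hypothetical target table named orders whose files were staged earlier with PUT:

    -- Files previously uploaded to the table stage, e.g.:
    --   PUT file:///tmp/orders_2020*.csv @%orders
    -- can be loaded without a FROM clause; Snowflake defaults to the
    -- table's own stage (@%orders).
    COPY INTO orders
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);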
ETL Lead Data Warehouse Developer, February 2010 to August 2011.
Environment: Ascential DataStage 7.5, Teradata V2R5/V2R7, ERwin 4.1.4, Oracle 10g, OBIEE 9i, Cognos 8, AIX 5.3, Maestro/Autosys Scheduling, Windows XP.
ETL Data Warehouse Developer, November 2003 to December 2006.
The data insights served to pinpoint signs of student disengagement.
Involved in migration from on-premises to the AWS cloud.
If you have petabytes of data to migrate from Teradata to Snowflake, we recommend an initial full ingest with BryteFlow XL Ingest.
BryteFlow Ingest provides a range of data conversions out of the box, including typecasting and GUID data type conversion, to ensure that your data migrated to Snowflake is ready for analytical consumption.
Extracted data from multiple data sources, performed multiple complex transformations and loaded data into SQL Server tables.
Developed test scripts, test plans and test data.
BryteFlow's Data Integration Tools.
Snowflake is a SaaS (Software as a Service) solution based on ANSI SQL with a unique architecture.
BryteFlow XL Ingest uses smart partitioning technology to partition the data and parallel sync functionality to load data in parallel threads.
Named Stage: the following example loads data from all files in the my_stage named stage, which was created in Choosing a Stage for Local Files.
About BryteFlow Blend: merges data from different sources and prepares it for Analytics, Machine Learning etc.
About BryteFlow TruData: ensures completeness of data, including Type 2, and issues alerts if data is missing.
Created an ETL job/custom data pipeline to migrate bulk data from on-premises legacy systems to the cloud to suit end-user needs.
Experience at Caterpillar working with AWS (S3, Lambda, Fargate, DynamoDB, SQS, SNS etc.), Microsoft Azure and Snowflake-associated technologies to build the Telemetry BI Store, making all telemetry data available in one common place to support end-user needs.
Database and Schema.
Parallel loading threads greatly accelerate the speed of your Teradata data migration to Snowflake.
Highly energetic, with a relentless approach to solving problems and a very strong sense of accountability and ownership.
Finally, we are done with the migration!
In addition to the main TSF pipeline, the Telematics Data Hub has two data pipelines capable of funneling TSF V0 and VIMS Productivity data into Snowflake data tables.
Enriched messages (those that successfully exit Message Steward) are ready to be persisted.
Snowflake Services Partners provide our customers with trusted and validated experts and services around implementation, migration, data architecture and data pipeline design, BI integration, ETL/ELT integration, performance, running POCs, performance optimization, and training.
Works independently on complex process modules and customized solutions to address business problems.
If there is a power outage or network failure, you don't need to worry about starting the Teradata data migration to Snowflake process over again.
The data replication tool has been specially created to move large datasets across in minutes.
Snowflake offers the opportunity for personal and professional growth on an unprecedented scale.
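The named-stage example referred to above is not reproduced here; a sketch of what it looks like, using the my_stage and mytable names from the referenced documentation example:

    -- Loads all files currently in the named stage my_stage into mytable.
    COPY INTO mytable
    FROM @my_stage;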
If there is a power outage or network failure, you don't need to worry about starting the Netezza data migration to Snowflake process over again.
Worked on different file formats such as Avro, Parquet and JSON.
Documented the Component Test and Assembly Test results in a common SharePoint.
Step 9: Decommission Teradata.
Migration: copied production SQL Server databases to a test server running Windows Server 2008 and SQL Server 2008; developed automated daily, weekly and monthly system maintenance tasks such as database backup, replication verification, database integrity verification and indexing updates.
These files are data formats used in the legacy CCDS system built in Azure.
Kargo: Democratizing Data with Snowflake.
Resolved business-critical issues in the production environment and helped the production team.
Prepared the Oozie workflows and scheduled the workflows using Coordinators.
Cat Digital Data Warehouse & Telematics Data Hub: developed an ETL process to pull dealer data from Snowflake to Oracle for Drive Train consumer needs.
About BryteFlow Ingest: the data replication superstar, it replicates data from any file, database or API.
Incorporated data from systems all over the enterprise, including point-of-sale, human resources, merchandise planning, distribution and PO management.
Snowflake: Snowflake Connector for Kafka — download from Maven.
Also involved in cloud technologies (Microsoft Azure, AWS, Snowflake).
You just need a couple of clicks to set up the data migration.
About BryteFlow XL Ingest: specially designed to replicate tables over 50 GB fast and seamlessly.
Led the creation of data flow diagrams, mapping documents, technical designs, code reviews, test strategies and implementation plans.
Teradata is known for performance and has a great feature set that caters to the most exacting of requirements.
BryteFlow Ingest creates your tables on Snowflake automatically, so you can be up and running fast and never need to code.
Performed coding, testing and code review of ETL changes for enhancements and defects and ensured on-time delivery.
Recently, however, cloud data warehouses like Snowflake are proving more cost-effective, separating storage and compute and offering infinite scalability, managed services, ease of use and much lower costs.
Experience in developing and designing products/solutions using a wide array of tools and technologies that help organizations use their data strategically to innovate their business models as well as reduce cost, improve efficiency, and comply with regulations.
We used BryteFlow software to build a data lake that maps the journey of a student from sign-up to course completion.
February 2019 - Current.
Summary.
With just a few clicks you can set up your Teradata migration to Snowflake - no coding, no delays and very cost-effective.
Led a migration project from Oracle to a Snowflake warehouse to meet customer SLAs.
Seeing that, I could not resist the urge to take a closer look at this technology and poke into some of its pain points.
CAB meetings and the migration process followed the defined change management process per company standards prior to production deployments.
You may have many legacy databases, either on premises or in hybrid implementations, that you would like to migrate to Snowflake.
The data flow for our TSF pipeline is as follows.
Skills: Microsoft SQL Server 2005, 2008, 2012, Oracle 10g and Oracle 11g, SQL Server BIDS, Microsoft …
Depending on your particular data warehouse ecosystem, Snowflake Professional Services can help recommend the best technologies for your migration. BryteFlow makes moving your data from Teradata to Snowflake very easy. Snowflake has some good documentation on their site that will help aid in the project management aspects of preparing and executing your data migration to Snowflake.
Worked on Hive optimization techniques to improve the performance of long-running jobs.
Teradata is a database solution with Massively Parallel Processing and a shared-nothing architecture.
Firehoses batch-save the files to separate folders (tsf-v0 and vims-productivity-v0) in the same S3 bucket as TSF, where the data is then stored in Snowflake by means of SQS queue triggers.
Traditionally Teradata has been installed on-premises, but with a global shift to the cloud, organizations are considering cloud data warehouses for faster speed and economy. For legacy data warehouse migrations, Snowflake partners with multiple technology solutions in order to facilitate the smoothest and most efficient transition possible.
Used analytical functions in Hive for extracting the required data from complex datasets.
Users can get to creating tables and start querying them with a minimum of preliminary administration.
You have just implemented a mass exodus from your on-premises data warehouse to Snowflake.
Unenriched TSF messages are placed on a Kinesis stream from the IoT Gateway.
You will learn, innovate, and excel at a company focused on data architecture uniquely built for the cloud.
Performance-tuned mappings and sessions to achieve the best possible performance.
BELLEVUE, Wash., Dec. 9, 2020 /PRNewswire/ -- Mobilize.Net announces the release of the Mobilize.Net SnowConvert Assessment Tool Beta that supports migrations from Teradata to Snowflake.
Snowflake delivers the Data Cloud — mobilize your data with near-unlimited scale and performance.
Teradata is an on-premises data warehouse solution that is immensely scalable, supports high concurrency and uses Massively Parallel Processing for delivering data fast.
Environment: IBM DataStage 8.5/11.3, Teradata V2R14, Oracle, PL/SQL, AWS EMR, EC2, S3, CloudWatch, Lambda functions, Step Functions, AWS CLI, CDH 5.8.2, Hadoop 2.5.0, Microsoft Azure, Tidal and Windows XP.
Sr ETL Developer, January 2012 to April 2012, TJX Enterprise Data Warehouse (EDW), Framingham, MA.
Created the ETL mapping document and ETL design templates for the development team.
Actian Avalanche is a fully managed hybrid cloud data warehouse service designed from the ground up to deliver high performance and scale across all dimensions – data volume, concurrent users, and query complexity – at a fraction of the cost of alternative solutions.
Created tasks, worklets and workflows and scheduled workflows to run the jobs at the required frequency using Workflow Manager.
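As a hedged illustration of that "minimum of preliminary administration" point (all object names below are hypothetical), a warehouse that suspends and resumes itself plus a table you can query immediately is only a few statements:

    -- Hypothetical names throughout (reporting_wh, sales_db, orders).
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
      WAREHOUSE_SIZE = 'XSMALL'
      AUTO_SUSPEND   = 60     -- suspend after 60 seconds of inactivity
      AUTO_RESUME    = TRUE;  -- resume automatically when the next query arrives

    CREATE DATABASE IF NOT EXISTS sales_db;

    CREATE TABLE IF NOT EXISTS sales_db.public.orders (
      order_id    NUMBER,
      customer_id NUMBER,
      order_total NUMBER(12,2),
      order_date  DATE
    );

    SELECT order_date, SUM(order_total) AS daily_total
    FROM   sales_db.public.orders
    GROUP  BY order_date
    ORDER  BY order_date;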
When your data is being migrated from Teradata to Snowflake, you can monitor the completeness of your data with BryteFlow TruData.
Created error handling and audit process common modules for use across the project.
Created ETL pipelines using Stream Analytics and Data Factory to ingest data from Event Hubs and Topics into SQL Data Warehouse.
Data warehouse automation. Keboola: No requirements.
Assisted new developers to build skills in DataStage and DB2 and bring them up to speed.
Snowflake, or SnowflakeDB, is a cloud SaaS database for analytical workloads and batch data ingestion, typically used for building a data warehouse in the cloud.
Standardized processes to minimize development and testing costs.
Involved in deployment activities and hypercare activities.
Adhering to this timeline was essential because it was costly to the business, both in infrastructure resources and people hours, to keep SQL Server running in parallel with Snowflake.
Migrating large volumes of data from Teradata to Snowflake is not easy – a huge amount of manual effort and time is needed to transfer data, convert to Snowflake schemas and manage the ongoing replication while both data warehouses run in parallel.
Assisted new developers to build skills in DataStage and Teradata and bring them up to speed.
Strong development skills with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob and Azure Storage Explorer.
Origin data is now accessible to functional teams across the organization, consolidating all workloads and databases into one powerful engine.
Strong analytical and problem-solving skills, with the ability to work independently and in team environments, simultaneously on multiple projects with competing priorities.
Because many assets still send the Data Hub this data, TDH processes and stores these messages as well.
Read any of Snowflake's migration guides, reference manuals and executive white papers to get the technical and business insights of how and why you should migrate off …
Supported unit, system and integration testing.
At Caterpillar, I got exposure to multiple projects with different technologies and performed diverse roles, starting from developer to designer, Tech Lead and Support Lead. Below are some of the projects and brief engagements that I carried out during my tenure.
Data ingestion/transformation in Snowflake can be done using external third-party tools like Alooma, Stitch etc.
Environment: IBM Information Server 8.7 (DataStage and QualityStage, FastTrack, Business Glossary, Information Analyzer), Netezza 4.x, Cognos Query Studio v10, Windows XP.
ETL Lead Data Warehouse Developer, September 2011 to December 2011, Anheuser-Busch InBev (ABI), St. Louis, MO.
BryteFlow migrates your tables and data from Teradata to Snowflake automatically.
Snowflake's architecture uses a hybrid of traditional shared-disk and shared-nothing architectures.
BryteFlow TruData is our data reconciliation tool that performs point-in-time data completeness checks for datasets, including Type 2 data, and provides notifications should data be missing.
Primarily involved in data migration using SQL, SQL Azure, Azure Storage, and Azure Data Factory.
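To make the Type 2 discussion concrete, the sketch below shows one generic way to apply a batch of change records to a Type 2 history table in Snowflake SQL. It is an illustration only, not BryteFlow's actual implementation, and all table and column names (dim_customer, customer_deltas) are hypothetical.

    -- Close out the current version of every key present in the delta batch...
    UPDATE dim_customer
    SET    is_current = FALSE,
           valid_to   = CURRENT_TIMESTAMP()
    WHERE  is_current = TRUE
    AND    customer_id IN (SELECT customer_id FROM customer_deltas);

    -- ...then insert the new version of each changed or newly arrived key.
    INSERT INTO dim_customer
           (customer_id, name, segment, valid_from, valid_to, is_current)
    SELECT  customer_id, name, segment, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM    customer_deltas;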
Mentored the technical development team on optimal utilization of emerging ETL and Big Data solutions.
Provided analysis, design, development, testing, UAT, implementation and post-implementation support activities across the full SDLC life cycle.
To learn more about how to load data via data ingestion tools, Snowflake provides a partner account which offers a free trial.
Developed Unix scripts for execution in the production environment.
If configured, BryteFlow also merges the data across all deltas with SCD Type 2 history.
To convert such procedures to Microsoft SQL Server, you can use a table-valued function and reference it in the FROM clause of a SELECT statement.
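A comparable pattern exists in Snowflake SQL as a table function queried through TABLE(...) in the FROM clause; the sketch below is a hedged illustration with hypothetical names (orders, orders_for_customer), not a definitive conversion recipe.

    -- Rough Snowflake SQL counterpart of a procedure that returns a result set:
    -- a SQL table function, invoked in the FROM clause of a SELECT statement.
    CREATE OR REPLACE FUNCTION orders_for_customer (cust_id NUMBER)
      RETURNS TABLE (order_id NUMBER, order_total NUMBER(12,2))
      AS
      $$
        SELECT order_id, order_total
        FROM   orders
        WHERE  customer_id = cust_id
      $$;

    SELECT *
    FROM   TABLE(orders_for_customer(42));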
Once all workflows and applications are running on Snowflake, inform all your Teradata users and decommission the Teradata environment.