- Modern Data Architecture: Set the strategy for the rollout, selection and delivery of proven as well as emerging tooling to ensure that data pipelines are scalable, repeatable and secure, and serve multiple users across the organization. The incumbent is responsible for implementing highly complex, multi-faceted big data initiatives associated with the following functional areas:
- Translate complex functional and technical requirements into detailed designs and high-performing capabilities;
- Lead the design and build of data patterns and services - spanning batch, real-time and complex event handling - leveraging open technologies;
- Ensure timely delivery against project timelines by automating development and deployment tasks;
- Effectively manage operational risk and change with a continuous focus on process improvements.
- Governance and Controls: Responsible for ensuring that all data management processes conform to standard operating procedures and that the modern data architecture aligns with enterprise technology standards and policies; contribute to and provide guidance on data quality, metadata management, data stewardship, security, and access controls.
- Innovation and Continuous Improvement: Evaluate and conduct proofs of concept (PoCs) of new, value-driven data technologies and capabilities; work closely with Manulife’s innovation teams (e.g., LOFT); remain current with industry advancements by working closely with external vendors and staying connected to Apache Foundation projects.
- Demonstrated 7-10 years of professional experience in big data/data management, plus a university degree in Engineering, Computer Science or an equivalent program;
- Extensive expertise in data technologies and in the use of data to support software development, advanced analytics and reporting, with a particular focus on cloud (Azure) and Hadoop-based technologies, programming or scripting languages such as Java, Scala, C++, PHP, Ruby, Python, R and SAS, and Linux environments.
- Expert knowledge of NoSQL and relational (RDBMS) databases such as HAWQ/HDB, MongoDB, Cassandra or HBase.
- Working experience with modern data streaming using Kafka, Apache Spark and Flink, and with data ingestion and processing frameworks such as NiFi, Hive and Pig;
- Experience with security and data protection solutions such as Kerberos, Active Directory, HDFS access controls, OAuth2, OpenID and LDAP;
- Experience with network-layer security design, including VPNs, firewalls and load balancers;
- Proven ability to translate non-technical user requests into complex technical specifications and solutions that meet those requirements;
- Excellent organizational and time-management skills, with the ability to multi-task and to perform duties with minimal or no supervision.
- Initiative to organize the various functions necessary to accomplish department activities and goals, combined with being a strong team player.