Best title for these responsibilities

Charter / Data Mæstro (previously T-Mobile)
May 5 · 11 Comments

40% of the work is reverse-engineering database structure all the way back to value input across a wide array of applications: business rules, native application data types and validation, and the entire methodology for field input, including how fields get copy-pasted by users and changed as they pass through the software stack, plus advising on the best source for each data point.

The other 40% is SQL development and delivering custom APIs to software development teams so they don't have to understand the application/database context to build software: building new data structures and tables, and engineering candidate keys to tie together applications that were never given proper PK/FK relationships (little to no system-to-system automation).
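A minimal sketch of that candidate-key engineering, using sqlite3 so it runs anywhere. The table and column names are hypothetical, not from any real Charter system; the point is normalizing fields that both applications happen to capture so they can serve as a join key where no FK exists:

```python
import sqlite3

# Hypothetical illustration: two applications store the same customers,
# but neither was built with a shared primary/foreign key, so we
# engineer a candidate key from fields both systems happen to capture.
con = sqlite3.connect(":memory:")
cur = con.cursor()

cur.execute("CREATE TABLE billing_app (acct TEXT, region TEXT, name TEXT)")
cur.execute("CREATE TABLE provisioning_app (account_no TEXT, site TEXT, status TEXT)")
cur.executemany("INSERT INTO billing_app VALUES (?, ?, ?)",
                [(" 0042-17 ", "NE", "ACME CABLE"), ("0099-01", "SW", "Foo Corp")])
cur.executemany("INSERT INTO provisioning_app VALUES (?, ?, ?)",
                [("004217", "ne", "active"), ("009901", "sw", "suspended")])

# Candidate key: account digits plus upper-cased region, normalized the
# same way on both sides so the join actually matches.
rows = cur.execute("""
    SELECT b.name, p.status
    FROM billing_app b
    JOIN provisioning_app p
      ON REPLACE(TRIM(b.acct), '-', '') = p.account_no
     AND UPPER(b.region) = UPPER(p.site)
""").fetchall()
print(rows)
```

In practice the normalization (trim, strip separators, case-fold) is the whole job; if either system lets users free-type the key fields, the join silently drops rows.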

The last 20% is prototyping new databases, structures, enriched tables, and Tableau reports that requesters didn't even think were possible.

Old cable company, super immature at using data for software (field formats and preserving hierarchies matter), to the point of screaming about why 100 variations of a customer name obliterate any chance of customer profiling, let alone automation and ML.
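The "100 variations of a customer name" problem boils down to the lack of any canonical form. A tiny sketch of the normalization that makes profiling possible again; the variant strings are made up for illustration:

```python
import re

# Hypothetical variants of one customer name as different apps stored it.
variants = ["ACME Cable, Inc.", "Acme Cable Inc", "  acme CABLE inc.", "ACME-CABLE INC"]

def normalize(name: str) -> str:
    """Crude canonicalization: case-fold, strip punctuation, collapse spaces."""
    name = re.sub(r"[^\w\s]", " ", name.lower())  # punctuation -> spaces
    return " ".join(name.split())                  # collapse whitespace

print({normalize(v) for v in variants})  # {'acme cable inc'}
```

Four raw strings collapse to one key. Real entity resolution needs more (legal-suffix tables, fuzzy matching), but even this crude pass is the difference between 100 "customers" and 1.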

Current title: Network Engineer. HR has zero idea what to do with me.

What is this job called at other companies, or does it not exist? I have no idea what I should be getting paid or what title to ask for.

Comments
  • Credit Sesame an0nm00se
    Kind of. Learn Python on your own outside of work, with a bit of AWS and PySpark, and you can up your TC greatly I imagine, probably at a better company too.
    May 5
    • Credit Sesame an0nm00se
      Good to get experience outside of work. I didn't get exposed to AWS in my last two jobs; I practiced outside of work. I typically use up some of the free credits, but unless you are leaving clusters or instances on 24/7, it's surprisingly cheap if you monitor everything properly and don't forget to turn off an instance or cluster.

      If you know Python, check out boto3, the Python SDK for AWS. Super awesome. If you are in data engineering, just focus on EC2, EMR, Airflow (Python library) for automated workflow/ETL, and PySpark to be used on EMR. Eventually get to Kafka/Kinesis and maybe Flask APIs/microservices.
      May 5
    • Charter / Data Mæstro (OP)
      That's really helpful, I definitely will.

      I'd really like to find some volunteer or part-time data engineering work with a more mature team at some point. I learn exponentially faster when I can ask questions and use/reference others' code, adapting it for my use case.

      Most of the problem with dumpster-diving through Stack Overflow, for me, is that I spend so much time fitting the wrong pieces together that I lose interest in solving the problem.
      May 5
    • Credit Sesame an0nm00se
      Highly suggest Medium; lots of great articles with code-alongs. Try to follow along coding yourself and save to GitHub. It's definitely hard to teach yourself; reading and practicing a lot is key. Luckily you are probably getting great SQL experience at work since you use it extensively, so you have SQL down.

      One thing that helped me was trying to make an automated cloud ETL pipeline project using Python, Airflow, PySpark, and AWS/EMR (to run the PySpark script). Basically it gets data from a URL as JSON, parses it, does some calcs, pushes the output to S3, then all steps are automated with Airflow running on an EC2 instance as the scheduler. I used boto3 to do a lot of the AWS config and instance spin-ups/downs instead of bash or point-and-click on the AWS console.
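      The pipeline described above can be sketched in plain Python to show the extract/transform/load shape. Everything here is a local stand-in and all names are illustrative: in the real project "extract" would hit a URL, "load" would be a boto3 S3 put, and Airflow would schedule the steps:

```python
import json, tempfile, pathlib

def extract() -> list[dict]:
    """Stand-in for an HTTP endpoint returning JSON."""
    raw = '[{"cust": "A", "mrr": 50}, {"cust": "B", "mrr": 120}]'
    return json.loads(raw)

def transform(records: list[dict]) -> dict:
    """Do some calcs: total and average monthly recurring revenue."""
    total = sum(r["mrr"] for r in records)
    return {"total_mrr": total, "avg_mrr": total / len(records)}

def load(result: dict) -> pathlib.Path:
    """Stand-in for an S3 upload: write JSON to a local temp file."""
    out = pathlib.Path(tempfile.mkdtemp()) / "output.json"
    out.write_text(json.dumps(result))
    return out

path = load(transform(extract()))
print(json.loads(path.read_text()))  # {'total_mrr': 170, 'avg_mrr': 85.0}
```

      Keeping each stage a separate function is what makes the later Airflow step easy: each function maps onto one task in the DAG.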

      You will run into problems. This is good; this is where you will learn. I had a lot of hiccups during the project. I learned a tremendous amount... and my code still needs to be refactored/reformatted tremendously.
      May 5
    • Charter / Data Mæstro (OP)
      That is an excellent use case. Thank you so much for the tips, seriously.
      May 5
    • Credit Sesame an0nm00se
      No problem, I was in a very similar situation. Always keep learning; it is a true case of the tortoise and the hare. Medium was great for me because I could read quick, topic-focused articles on my phone wherever I was (i.e. great on the train).

      I may have some good resources to start out. Although I used boto3, it may be easier to start with the point-and-click AWS console and SSH into EC2/EMR instances (you can just automate the bash commands you use to submit Spark jobs to EMR, etc. in Airflow). Then, after you get everything running, come back to boto3 if you have time/are interested (it doesn't seem to be a requirement for data engineering jobs).
      May 5
  • Facebook ⭕w⭕
    data engineer
    May 5
    • Charter / Data Mæstro (OP)
      Even though our tech stack is kiddie-style? I can't get IT to install anything Java/Python related, so I have to do all cleans and transforms in SQL. It destroys my resume for data engineering roles at more mature orgs to not have those tools as practical work experience :(
      May 5
    • Facebook ⭕w⭕
      Roles and tech stacks are independent of each other; a SWE who only knows jQuery is still a SWE... kind of.
      May 5
    • Charter / Data Mæstro (OP)
      Good point and lol
      May 5