
dbt Contracts: Consumer User Experience Demo

12 mins


Hey folks, this is Sung speaking, and I'm gonna give you a demo of the work I've done so far on dbt contracts.

And so the purpose of this video isn't to say, "Hey, this is the full-blown solution and implementation." This is concept art.

And so you're gonna notice a lot of Frankenstein code, but that's the point. My main goal is to collect feedback and incite a response around: what parts of this role-playing user experience do you really resonate with and could see yourself enjoying, versus, "Ooh, actually this is kind of ugly," or, "This doesn't make sense if I'm sitting in the mindset and going through the behaviors of a dbt analytics engineer today."

And so first, I'm just gonna do some role playing to go through, on a surface level, what this looks and feels like.

And then two, I'm gonna peek under the hood of the code, just so you understand how the mechanics are working right now in a simulated way, versus, "Hey, this may be the actual implementation that we want."

All right. So with that in mind, I'm gonna put on my role-playing hat as an analytics engineer.

This assumes I am purely consuming an upstream contract. So for example, I'm a finance-only project, and I am importing and consuming the dbt nodes in scope from an upstream core-only project.

Okay. So with that context in mind, let's dive straight into it. Step one is making sure I have something called a dbt contracts YAML config, and I, as an analytics engineer, am gonna do some simple things similar to what I've been doing with packages my entire career.

And so what that looks like is: I'm gonna name it (and this has to be unique, too), and I'm gonna call upon the contract version, similar to how we call upon versions for packages.

I'm gonna point it to a contract location, and I'm going to provide some credentials. And then afterwards, I'm gonna run a command that looks like this.
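A sketch of what that config could look like is below. This is purely illustrative: the file name `dbt_contracts.yml` and every key in it (name, version, location, credentials) are my own placeholders for the concept being demoed, not a real dbt schema. Only `env_var()` is dbt's actual templating function.

```yaml
# dbt_contracts.yml (hypothetical; all keys are illustrative)
contracts:
  - name: core_only            # has to be unique within this file
    version: 1.0.0             # pinned like a package version in packages.yml
    location: "s3://core-only-contracts/contracts.json"
    credentials:
      aws_access_key_id: "{{ env_var('AWS_ACCESS_KEY_ID') }}"
      aws_secret_access_key: "{{ env_var('AWS_SECRET_ACCESS_KEY') }}"
```

The command itself would then just be `dbt contracts`.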

And when I run this, I'll explain what the logs are doing mechanically. So step one, when I run this, is that I see the directory where it's located and the contracts consumed.

I probably would not expose these in the logs in practice, but for the purpose of this illustration, it's important to see the mechanics step by step in the logging.

Step two: I wanna make sure my credentials are verified across both contracts that I'm importing from, similar to how we verify the connection when you run dbt debug. Let's see if I have an example up here, right?

Hey, everything good? All checks passed, right? Next, say there's a contract version mismatch: for example, I want 2.0 of core-only, but it doesn't exist.

It says not compatible, and we're not gonna do anything with that core-only contract. The next step after that, as I look through the logs, I'm checking: okay,

so I consumed this contract, makes sense, it's in my local dbt project directory. And then also, here are the published nodes and the naming convention of the nodes.

That's not too useful yet. Here are the private nodes. Okay, makes sense. And then here is the test coverage, expected versus actual.

Oh, okay, invalid. Cool. Max upgrade time between versions: okay, that's a good rule of thumb, kind of serving me as a cautionary tale of, hey, just be prepared when you're upgrading from one to two, this is your time window.
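The version-mismatch check in those logs can be sketched as a small helper. To be clear, this is my own placeholder, not the demo's actual code; the function name and log format are illustrative.

```python
# Illustrative sketch of the version-compatibility check described above.
# `requested` would come from the dbt_contracts.yml config; `available`
# would be read from the published contract artifact.

def check_contract_version(name: str, requested: str, available: list[str]) -> str:
    """Return a log line saying whether the requested contract version exists."""
    if requested in available:
        return f"[OK] {name}=={requested} is compatible"
    return (
        f"[SKIP] {name}=={requested} not compatible: "
        f"published versions are {', '.join(available)}"
    )
```

So asking for 2.0 of core-only when only 1.0 is published would produce the "[SKIP] ... not compatible" line, and the contract gets ignored.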

Step three is understanding: hey, what permissions do I need? This is on the presumption that we're all accessing the same database.

And just making sure, behind the scenes, that before I can actually select from the published nodes, I have these permissions in scope in order to make that happen.

And then some quick code snippets, where I can go like, hey, I wanna get productive right now. I don't want to have to scrounge around and reason about, hey, what's the code I need to import this into a new file? I just grab it from here.

And then I would just, you know, open up a new file over here. I'll probably make a new file under my demo examples, let's see, consumer_example.sql. And then from there I select all from the published node, and then I'm good to go and I get to move on with my life.
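That ready-made snippet might be as small as the following. dbt's two-argument `ref` (project name, then model name) is real syntax for cross-project references; using it against a consumed contract is the hypothetical part, and the file path and model name are illustrative.

```sql
-- demo_examples/consumer_example.sql (illustrative)
select * from {{ ref('core_only', 'my_first_model') }}
```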

All right. So that's the entire workflow. If I'm a user going through this experience today, I get these little validation confidence boosts every step of the way: okay, my credentials are verified.

Okay, I know when something wasn't imported correctly, and why, right? 'Cause of a version mismatch. Totally understandable. Step three, I get some nice indentation where, just for the finance-only stuff,

here's essentially a report card of what I should care about, right? And it's around, hey, test coverage, things like that.

And honestly, 90% of the time, I'm just gonna skim straight to the bottom portion of these logs:

hey, select all from here, and I move on from there. Okay. And so I'm gonna end the role playing there now.

You're probably wondering: okay, Sung, what's going on mechanically to make these logs come to life, whether they're slapped together or not? First, let's start with the contract.

So overall, I just made this up. And what's really nice is that when you use some Python libraries to load this in, it comes in as a dictionary.

So it's pretty easy to work with from there. Okay. Another artifact to keep in mind is an example that looks like this.

Let's see... not this, this one. Right now, this is just a placeholder artifact, but it presumes that my published-nodes mechanic works really well within this contracting concept.

And I get this giant contracts.json. Right now, I am biasing towards housing all the information related to the catalog, manifest, run results, source results, and contract details in just one giant JSON file.

Now, I'm sure there are a lot of spicy opinions about whether that's a good idea or not, but for the purpose of this illustration, it just made it more intuitive: how is dbt contracts consolidating all that information and serving it up to me?

Oh, it's in a single file. So it's easy to reason about how my logging is showing what it's showing.

And then from there, some key information is just the metadata; that's pretty par for the course. You also get contracts.

And so this is kind of the meat and potatoes of the expectations versus the reality, similar to a Terraform state file. From there, it's a subset of my entire manifest.

So this will not include all 30-plus nodes I have in my upstream project. It will only contain the nodes for my_first_model and my_second_model, as a generic example.

And then we'll figure out a mechanism later to unite that with, you know, the consumer's manifest and catalog JSONs. From there, I get the nodes catalog, and I just copied and pasted a bunch of subset snippets from the existing catalog, run results, and sources JSON, all that fun stuff.

I'm gonna keep scrolling down here: nodes, run results (again, a simple subset), source results. Okay. So those are the artifacts.
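Pulling those pieces together, the single contracts.json might be shaped roughly like this. Every key and value below is an illustrative placeholder mirroring the sections just described, not a real dbt artifact schema.

```json
{
  "metadata": {
    "contract_name": "core_only",
    "contract_version": "1.0.0"
  },
  "contracts": {
    "test_coverage": { "expected": 0.80, "actual": 0.75 },
    "max_upgrade_time_between_versions": "30 days"
  },
  "nodes": {
    "model.core_only.my_first_model": { "access": "published" },
    "model.core_only.my_second_model": { "access": "private" }
  },
  "catalog": {},
  "run_results": {},
  "source_results": {}
}
```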

I'm gonna pause there, and now I'm gonna move on to: okay, what code is actually running to make this possible?

Okay. This is where it gets a little filthy. I built a new subparser command within main.py. I essentially made a duplicate of the dbt deps code and replicated that functionality here.
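That subparser wiring might be sketched like so. This is argparse-style and entirely illustrative of the approach described, not dbt's actual internals; the function and flag names are my own placeholders.

```python
# Minimal sketch of registering a new `contracts` subcommand, analogous to
# how `dbt deps` registers its own subparser inside main.py.
import argparse


def run_contracts(args: argparse.Namespace) -> str:
    """Placeholder entry point for `dbt contracts`."""
    return f"running contracts for profile {args.profile!r}"


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="dbt")
    subparsers = parser.add_subparsers(dest="command")

    # Register the hypothetical `contracts` subcommand.
    contracts = subparsers.add_parser("contracts", help="import upstream contracts")
    contracts.add_argument("--profile", default="default")
    contracts.set_defaults(func=run_contracts)
    return parser
```

With that in place, `dbt contracts` dispatches to `run_contracts` the same way `dbt deps` dispatches to its own handler.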

So when you type dbt contracts, it works out of the box. From there, it's like, okay, so what is it actually calling upon?

It's calling upon this. Now, this is quite messy, so bear with me for a little bit. Overall, you can ignore all of this.

The thing you should care about starts at line 63, where I'm running the code. And from here, here are the conceptual steps I need to do.

Here are some placeholder steps. So keep in mind, this is all smoke and mirrors, but that's the point. I just wanna go through the mechanisms and ask, oh, does this user experience make sense to me?

Right. But it's helpful to understand how I'm creating the smoke and mirrors in the first place. So step one: I locate the dbt contracts file.

Right now, I am just doing this. This is not an implementation I recommend; it's just what got it done in the short term.

And so I'm just building from my current working directory, which is in my gitignored dbt environment. I look for this file and I go from there. Next step:

I open up that file and render it, so I can consume it as a dictionary and have Python work a lot more intimately with it from there.
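That open-and-render step could be as small as this. It assumes PyYAML is available (a reasonable bet in a dbt environment), and the function name is my own placeholder.

```python
# Sketch of rendering the dbt_contracts.yml text into a plain dictionary
# so the rest of the command can work with key/value pairs directly.
import yaml


def parse_contracts_config(raw_yaml: str) -> dict:
    """Load the raw YAML text of the contracts config as a dict."""
    return yaml.safe_load(raw_yaml)
```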

I do some quick contract validation just to make sure connections are okay. And from here, I think it's helpful to look at it side by side with the contracts config.

It just takes these credentials. I'd probably use environment variables to authenticate to this S3 bucket, and probably use the boto3 library that AWS provides to make that happen.

I'd probably have to create a menu of options for things like GCS API keys, similar to how Terraform enables different remote backends today to store state files; we'd have to do the same for contracts.json.

The next step is just verifying that the credentials work. Right now this is all just dummy code, but it's cool to say, hey, this is how I get a little confidence boost that dbt contracts is working as expected.
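A non-dummy version of that verification step might look like the sketch below. The real implementation would pass in `boto3.client("s3")` and rely on its `head_bucket` call; here the client is duck-typed so the sketch stays library-agnostic, and the function name and log format are illustrative.

```python
# Sketch of the per-contract credential check, producing a dbt-debug-style
# "connection ok" log line for each remote contract location.

def verify_contract_credentials(s3_client, bucket: str) -> str:
    """Return a log line for one contract's remote location."""
    try:
        # With boto3, head_bucket raises if the bucket doesn't exist or the
        # credentials lack permission to see it.
        s3_client.head_bucket(Bucket=bucket)
        return f"Connection to s3://{bucket}: OK"
    except Exception as exc:
        return f"Connection to s3://{bucket}: FAILED ({exc})"
```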

I make a new directory, and I'll even show you an example of that; it would be right here: dbt_contracts.

Okay. You notice here, it just copies over that dummy contract JSON file that I showed you earlier. Okay, so the next step after this: I'm creating this dummy contract file location. In practice,

this would be the actual S3 bucket that I configured over here, but this is what made sense in the short term. Okay.

From there, I loop through each of the contracts in scope, open them up, load them in as dictionaries for Python, and then print a bunch of logs by calling upon the key-value pairs to get the information I care about.

And once I have all this information, I print it out in an ergonomic way, so that I, as the analytics engineer, just get to skim it and go: do I see the green check marks I want?

Yep. Ooh, I see a bad check mark; okay, I can ignore that for now. Oh, okay, I can just select star from this, pop it into a brand new SQL file, and move on with my life.
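The loop-and-print step just described might reduce to a formatter like this. The dict shape mirrors the placeholder contracts.json and is entirely illustrative, as is the function name.

```python
# Sketch of the final "report card" step: turn one consumed contract's
# key/value pairs into the skimmable log lines described above.

def format_contract_report(contract: dict) -> list[str]:
    """Build the report-card lines for one consumed contract."""
    lines = [f"Contract: {contract['name']} (v{contract['version']})"]
    # One line per published/private node.
    for node, meta in contract.get("nodes", {}).items():
        lines.append(f"  {meta['access']}: {node}")
    # Expected-vs-actual test coverage.
    cov = contract.get("test_coverage", {})
    lines.append(
        f"  test coverage: expected {cov.get('expected')} / actual {cov.get('actual')}"
    )
    # The copy-paste snippet at the bottom ('<model_name>' is a placeholder).
    lines.append(
        f"  snippet: select * from {{{{ ref('{contract['name']}', '<model_name>') }}}}"
    )
    return lines
```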

Okay, that is it. All right. I am not deeply emotionally attached to this workflow, but I think it is helpful to understand: when I run a command like dbt contracts

and I look at a config like this, what user experience enables me to feel confident that I'm getting the report card I want, so I can confidently and effortlessly copy and paste this and know it's gonna work as expected?

All right. Well, that's about it. See you.

