I was playing around with Copilot the other day and prompted it to set up an API call to a popular analytics API. It set the call up (kind of) properly, but the call included a random API key (not mine) that threw an invalid URI error… meaning the callback URI was wrong (obviously, since it's not my key) but THE KEY ITSELF WAS VALID! I suspect Copilot is grabbing code from all repos (private AND public), which means if developers don't .gitignore their config values, there's a chance Copilot will scoop up their auth keys and broadly distribute them. I dunno if this is accurate… it's hard to find specific information on how Copilot is actually trained. Anyone else experience this/see any other funky behavior from Copilot? Any #microsoft blinders have any thoughts on this? Am I way off base with my assumption? I'm going to test this a little more over the next couple of weeks to see if I can prompt for any other auth keys/etc. #github #copilot
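The .gitignore point above is the standard mitigation: keep real keys out of tracked files entirely, so there is nothing key-shaped in the repo for any model to train on. A minimal sketch (variable name `ANALYTICS_API_KEY` is hypothetical, not from any real API) of loading a key from the environment instead of hardcoding it:

```python
import os

def load_api_key(var_name: str = "ANALYTICS_API_KEY") -> str:
    """Fetch an API key from the environment instead of hardcoding it in source."""
    key = os.environ.get(var_name)
    if key is None:
        raise RuntimeError(f"Set {var_name} before running; never commit a real key.")
    return key

def auth_headers(key: str) -> dict:
    # The key only exists in memory at runtime, never in source or git history.
    return {"Authorization": f"Bearer {key}"}
```

Pairing this with a `.gitignore` entry for any local `.env` file keeps the value out of commits entirely.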
... Well, if you're not hiding your secrets that's embarrassing, but there's an enterprise Copilot which I believe is live already
TAGS approved at Dell? Or will that be in 5 years? 😂
Either the Lonestar or DSX platform will have it. It's been a topic in many of the Data Science Council and DSX platform meetings. I believe the current process involves making a GitHub account with your Dell email and putting in a request for a license
It’s trained on public repos and resources, zero chance it was a private key from a private repo.
Ah so someone just accidentally dropped their API key on a public repo? That makes sense
That’s possible. Or it was a fake semi-valid key — sometimes you see sample keys in docs or online resources. Hard to say for sure!
From your description it sounds like it doesn’t create the security risk, but rather exposes the security risk the engineer created.
Hot take but: same thing
Agreed - I mean bad practice is bad practice, through and through… I would have thought, though, that Copilot would be trained to sanitize any potential keys/secrets from its suggestions
It's kind of a security risk in that it promotes laziness, and if there's a security risk in a public repo, there's the possibility Copilot might introduce it into your code base if you're doing something similar. Also, I'm using the non-enterprise version and I've turned off telemetry, because it's concerning that I might hardcode a private key for testing, then it gets sent up to Copilot's servers for autocomplete and I have no idea what happens to it there. With telemetry turned on, will it get saved?
You are objectively wrong. You are also the sort of 'engineer' that will be fully replaced, as opposed to augmented with.
Objectively wrong about what?
My workplace blocked copilot for those reasons