If you were watching the automation of blue collar, retail and service jobs from the sidelines because your software engineering creativity is impossible to automate, you should be very, very anxious. Here are two non-trivial code snippets from ChatGPT.

***********************

Prompt: Write a CUDA kernel for batched matrix multiplication

Response:

__global__ void batch_matmul(float* A, float* B, float* C, int batch_size, int m, int n, int k) {
    // Compute the global index for the current thread
    int global_id = blockIdx.x * blockDim.x + threadIdx.x;

    // Check if the global index is within bounds
    if (global_id < batch_size) {
        // Compute the local indices for the current thread
        int batch_id = global_id / (m * n);
        int row_id = (global_id / n) % m;
        int col_id = global_id % n;

        // Initialize the dot product to zero
        float dot_product = 0.0;

        // Loop over the entries in the dot product
        for (int i = 0; i < k; i++) {
            // Compute the indices for the matrix entries
            int A_id = batch_id * m * k + row_id * k + i;
            int B_id = batch_id * k * n + i * n + col_id;

            // Accumulate the dot product
            dot_product += A[A_id] * B[B_id];
        }

        // Store the result in the output matrix
        int C_id = batch_id * m * n + row_id * n + col_id;
        C[C_id] = dot_product;
    }
}

***********************

Prompt: There's an mxn grid (0-indexed), each cell containing a positive integer. You are allowed to navigate from a cell to any of the 4 adjacent cells (not diagonals). Write a C++ method that minimizes the sum of the cells you need to remove to block the path from (0, 0) to (m-1, n-1)

Response:

#include <vector>
#include <algorithm>

// Returns the minimum number of cells that need to be removed to block the path
// from (0, 0) to (m-1, n-1) in the given grid.
//
// The grid is represented as a vector of vectors of integers, where grid[i][j]
// is the value at the (i, j) cell.
int minimumCellsToRemove(const vector<vector<int>>& grid) {
    // m and n are the dimensions of the grid.
    int m = grid.size();
    int n = grid[0].size();

    // dp[i][j] will be the minimum number of cells that need to be removed to
    // block the path from (0, 0) to (i, j) in the given grid.
    vector<vector<int>> dp(m, vector<int>(n));

    // We initialize the first row and first column of the dp array.
    // For the first row, we can simply remove all the cells in the path.
    // For the first column, we need to remove the cells with the minimum value.
    for (int i = 0; i < m; i++) {
        dp[i][0] = i;
    }
    for (int j = 0; j < n; j++) {
        dp[0][j] = *min_element(grid[0].begin(), grid[0].begin() + j + 1);
    }

    // We fill the remaining cells in the dp array using the following
    // recursion:
    //
    //   dp[i][j] = min(dp[i-1][j], dp[i][j-1], dp[i-1][j-1]) + grid[i][j]
    for (int i = 1; i < m; i++) {
        for (int j = 1; j < n; j++) {
            dp[i][j] = min(dp[i-1][j], dp[i][j-1], dp[i-1][j-1]) + grid[i][j];
        }
    }

    // The minimum number of cells that need to be removed to block the path from
    // (0, 0) to (m-1, n-1) is stored in dp[m-1][n-1].
    return dp[m-1][n-1];
}

***********************

I cannot imagine how, 10 years down the line, we don't have full-fledged distributed systems running generative models.
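A note on the first snippet: the bounds check (if (global_id < batch_size)) looks off, since the index math assumes one thread per output element, i.e. batch_size * m * n threads in total. One way to catch that kind of bug is to diff the kernel's output against a plain CPU reference. A minimal sketch (the function name and layout assumptions are mine, not from the post):

```cpp
#include <vector>
#include <cstddef>

// CPU reference for batched matmul: C[b] = A[b] * B[b], where A[b] is (m, k),
// B[b] is (k, n), and C[b] is (m, n), all stored contiguously in row-major
// order, batch after batch — the same layout the quoted kernel indexes into.
std::vector<float> batch_matmul_ref(const std::vector<float>& A,
                                    const std::vector<float>& B,
                                    int batch_size, int m, int n, int k) {
    std::vector<float> C(static_cast<std::size_t>(batch_size) * m * n, 0.0f);
    for (int b = 0; b < batch_size; b++)
        for (int row = 0; row < m; row++)
            for (int col = 0; col < n; col++) {
                float acc = 0.0f;
                for (int i = 0; i < k; i++)
                    acc += A[b * m * k + row * k + i] * B[b * k * n + i * n + col];
                C[b * m * n + row * n + col] = acc;
            }
    return C;
}
```

Running the kernel and this reference on the same small random batch and comparing element-wise would expose the guard bug immediately: most output elements would simply never be written.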
Writing the code is the easy and boring part of the job; I hope we get there.
As I said, we're already at writing code. 10 years into the future, it will be able to do much, much more.
At the end of the day, you are hired to solve problems. Coding is just a tool to accomplish that, and LLMs, or whatever we end up using once the code they produce is verifiably correct, will just become another tool that 10x's the output of the people who embrace it. You shouldn't be afraid of technological advancement.
I think it's already happening. Machines teach us chess, correct our grammar, recommend what to watch and listen to. It's gonna get more and more "scary".
This is literally so scary. One month ago I didn't even think this was possible within the next century. Also, I tried asking the same coding question multiple times, and each time it generated a different solution with different comments and documentation, which means these solutions aren't hardcoded into its database.
UBI baby! Everyone gets to coast
UBI will be like living off welfare.
Cope. It won't happen until all white-collar work has been automated; otherwise the elites will always have braindead simps, akin to Elon fanboys, serving as their first line of defense against legislation.
Well, it used a global and camelCase, so we're good for a while.
Imagine what they could do with something similar that was actually built specifically for coding. GPT-3 has no idea whether its code is any good or not.
I will play devil's advocate here: writing a single function is pretty elementary. Apply an algorithm and think through the tradeoffs (usually simplicity vs. memory vs. CPU). Outside of tech interviews, where you can't go to Stack Overflow, it's not that difficult. Having a lot of functions that have to work in harmony is orders of magnitude harder, especially once concurrency is added. Bonus: the hardest part of my job is dealing with people and actually deciding WTF the product should look like. Once we start implementing, the code isn't rocket science. Generating multiple possibilities with AI and testing them would be a benefit. Or we all lose our jobs.
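The "generate multiple possibilities and test them" idea is basically differential testing: run the candidates on the same inputs and flag any disagreement. A minimal sketch, with all names mine and a trivial spec (summing a vector) standing in for whatever the AI was asked to write:

```cpp
#include <vector>
#include <random>
#include <functional>

// Differential test: feed the same random inputs to two candidate
// implementations of the same spec and check whether they ever disagree.
// Returns true if the candidates agreed on every trial.
bool candidates_agree(const std::function<int(const std::vector<int>&)>& cand_a,
                      const std::function<int(const std::vector<int>&)>& cand_b,
                      int trials, unsigned seed) {
    std::mt19937 rng(seed);  // fixed seed keeps the run reproducible
    std::uniform_int_distribution<int> len_dist(1, 16);
    std::uniform_int_distribution<int> val_dist(-100, 100);
    for (int t = 0; t < trials; t++) {
        std::vector<int> input(len_dist(rng));
        for (int& v : input) v = val_dist(rng);
        if (cand_a(input) != cand_b(input)) return false;
    }
    return true;
}
```

This only shows that candidates agree with each other, not that they are correct; with a trusted reference implementation on one side, though, it becomes a real oracle.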
It's over. We had a good run as a society. I'm going to buy a cabin and enough equipment that I can go off the grid when shit starts hitting the fan.
And then what
This is still a standard Leetcode problem, which isn't how you write enterprise software. Writing a function that does something is easy. Navigate a large React/Spring Boot code base and add a feature, or fix a bug without a human hunting the bug down first... then we'll talk. This is just advanced Googling.
I didn't think this sort of Leetcode problem solving was possible, yet here it is. Who knows how much more advanced it will get.
I think it's exciting, not scary.
Yep, I encountered a large language model. Killed my dog and stole my girl. Don't trust them.
Laugh all you want; 10 years from now we will be slaves to these models.
Idk anything about this AI thing besides a few posts on Blind, but it seems like it was partially trained on a ton of LC questions. So none of us are losing our jobs, don't worry.