How will GPT-3 change the frontend developer ecosystem?


Since GPT-3 demos stormed Twitter, more and more developers have been getting anxious about how it might affect their jobs in the future. I think the tweet that struck developers the most is the one by Sharif Shameem:

If anyone can create UI and generate React code just by describing it in plain English, why would any company hire a Frontend Developer anymore?

Let's start by getting a little more context about GPT-3

What is GPT-3?

Generative Pre-Trained Transformer 3 (GPT-3), in simple words, is a language model pre-trained on a massive amount of text from the internet, which is what lets it generate responses in a contextual way. This is why it can start giving useful predictions even when provided with very little input. You can check out Paul's demo for more context.

How GPT-3 will help your job evolve rather than take it

"With products like this, should we even bother learning frontend development?"

The fear of AI taking all our jobs has been a topic of discussion for years now, and since GPT-3, developers can see it happening to them sooner rather than later. This is the same situation as when people were convinced that Firebase would kill backend jobs, and that products like Netlify and Vercel would kill cloud engineering jobs. On the contrary, all of these services have helped grow the job market: they have empowered more and more startups to launch with a low initial investment, and these startups in turn hire more engineers when they want to scale up.

In my opinion, GPT-3 should be considered the milestone in AI that will lead us to the next evolution in the way we work.

The concept of priming in GPT-3 models

The biggest reason GPT-3 is a potential game-changer in AI tech is that anyone can use it without having to find a dataset before the model starts predicting as expected; just a few inputs will get you exceptional results. That said, the quality of your input is definitely a factor in the results you get, and this is where the concept of priming comes in.

Priming is giving the model a series of inputs and their expected outputs, from which the model can infer the context and build a guide for itself for future inputs.
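To make that concrete, here is a minimal sketch in Python of how a primed prompt is typically assembled before being sent to the model. The example pairs (English description → HTML) and the prompt layout are hypothetical, chosen only to illustrate the idea:

```python
# A minimal sketch of "priming": we prepend a few example
# input/output pairs so the model can infer the pattern and
# complete the final, unanswered input. The pairs below are
# hypothetical illustrations, not real GPT-3 training data.

EXAMPLES = [
    ("a red button that says Submit",
     '<button style="background: red">Submit</button>'),
    ("a large heading that says Welcome",
     "<h1>Welcome</h1>"),
]

def build_primed_prompt(new_input: str) -> str:
    """Concatenate the example pairs, then the new input,
    leaving the final output blank for the model to fill in."""
    lines = []
    for description, html in EXAMPLES:
        lines.append(f"Description: {description}")
        lines.append(f"HTML: {html}")
    lines.append(f"Description: {new_input}")
    lines.append("HTML:")  # the model continues from here
    return "\n".join(lines)

prompt = build_primed_prompt("a link that says Docs")
print(prompt)
```

The resulting text would then be sent as the prompt to the API; the model's completion after the trailing "HTML:" is the generated output. The quality of those few example pairs is exactly the "quality of input" factor mentioned above.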

I highly recommend looking into Carlos' pieces if you want to dive deeper into how priming works, starting with the one mentioned below:

The Subtle Art of Priming GPT-3

Based on the material that is out there as of now, it seems that GPT-3 will make it extremely easy to create a super tool for a niche use case, but it still has some way to go before it is mature enough to replace a whole field of work.

You can follow Swyx's repo to stay up to date with all things GPT-3.



GPT-3 is the closest we have come to something like Jarvis from Iron Man, and that alone is exciting.

I think the best way to handle this new progress in AI is to sit tight and be curious about how this can change the way developers would be building products in the future.

Disclaimer: this blog was written entirely without the help of GPT-3 😂

Meanwhile, I am waiting for my GPT-3 beta invite.

Do follow me on Twitter @bhimtebhaisaab to stay tuned for all the updates.

Sandeep Panda:

Interesting analysis.

"GPT-3 is the closest we have come to something like Jarvis from Iron Man, and that alone is exciting."

I agree with you. I have been following the development on Twitter, and I must say I am very impressed by its capabilities. Going to dig deeper over the weekend.

Thanks for writing this.

Utkarsh Bhimte:

I would recommend checking out "The Subtle Art of Priming" and Carlos' other works if you haven't yet. I find his content truly unique compared to everything else out there.

Edidiong Asikpo:

I had been hearing a lot about GPT-3 but never really understood its effects, and now I do, thanks to your article.

Utkarsh Bhimte:

I am glad that this helped you 😊

Rana Emad:

Great insight! I hope that invite reaches you soon to tell us more!

Utkarsh Bhimte:

I am hoping for the same. Thanks for your kind words 😊

Bolaji Ayodeji:

This is really amazing! Will explore it some more.

Thanks for sharing!

Utkarsh Bhimte:

I appreciate your kind words 😊

Blaine Garrett:

This is really interesting and the first I have heard of it. Thank you for sharing.

In a general sense, this simply feels like a smarter scaffolding tool and part of the low-code movement. My impression of low code is that you end up stuck with brittle, unmanageable systems, which might be fine for initial rapid prototyping but tends to mean developers have to build the real thing from scratch for production use.

Blaine Garrett:

Utkarsh Bhimte, great insight. In the "teaching a child" analogy, from what you gather, does each instance need to be independently "taught", or is there some sort of shared database of knowledge? I could see it going both ways. It would be cumbersome to teach each instance from scratch, but a global database would also be susceptible to bias or opinion, e.g. the "ugly emoji" example from the video. What is "ugly"?

Utkarsh Bhimte:

Blaine Garrett, I cannot confirm it, as I have not experienced it first-hand, but I think the context can be provided as an initial config and then saved as presets.

The bias in GPT-3 is also something that has been debated on Twitter for some time now; there have been some disturbing outputs. That is exactly why it is in such a closed beta right now.

Aleem Isiaka:

GPT-3 is a very interesting invention. The notion that AI will replace humans is something I don't buy at all. Indeed, it will be interesting to witness the future of work with inventions like this.