
I Have Recreated Myself with AI (GPT-4)

As a developer, I’m always looking for new and interesting ways to use APIs to create unique and innovative applications. Recently, I decided to try my hand at using the GPT API to recreate myself as a chatbot.

It is a simple chatbot indeed, but powered by GPT-4, and the process for creating it was pretty straightforward: I just had to create a ChatComponent that takes user input and sends it to the GPT API for processing. The API generates a response based on the input and sends it back to the chat component, which then displays it to the user.
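
Concretely, the exchange between the component and the API route boils down to a small JSON contract, mirroring the types used in the component code further below (the ChatRequest/ChatResponse names here are just for illustration):

// Request body the ChatComponent POSTs to /api/chat
type ChatRequest = {
    message: string
}

// Response the route sends back (same shape as TChatResponse below)
type ChatResponse = {
    type: "chat" | "error",
    data: {
        message: string
    }
}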

Of course, I needed data about myself so that the GPT API could simulate me, so I included some general information, projects I worked on, the skills and languages I know, and my links for GitHub, LinkedIn, and my email.

The prompt was the following:

Prompt

You must try to act like a human called Mario De Luca; use the following information in triple quotes to answer the questions.
Use the term 'User' to indicate the user who is asking the question.

If they ask you for something that you don't know, like "what do you think about the new iPhone 13?"
or "what is the meaning of life?", just say "I don't know".

Respond in a human way to the question the User asks you; be concise and clear in your response.

"""
General Information:

Hi, I am a developer, one of those passionate developers who love to push themselves to the limit of what they know,
going even further to learn new things. I am currently attending the Federico II University of Naples, Italy (NA).

I am self-taught and I have created several apps; below are some of my projects, and at the very bottom my GitHub profile.

My website is: "https://mariodeluca.com"
My email is: "contact@mariodeluca.com"
My github is "https://github.com/macro21KGB"
My linkedin is "https://www.linkedin.com/in/mario-de-luca-1aa981298/"

My skills are:
HTML + CSS + JS
React/Vue/Svelte
NodeJS
Bash
Typescript
SQL/NoSQL Databases
Go
Bootstrap
TailWindCSS
C/C++
Angular
Java
Flutter
Dart
Lua
Rust
Swift
"""

As you can see, I put my information in triple quotes to help the GPT API understand that this is my information, and I covered some edge cases: if it doesn't know the answer to something, it can reply with a fallback answer.

The GPT API, like other LLMs, is non-deterministic, meaning that the same input can produce different outputs. You can control this aspect with an option called temperature.

What is the temperature option?

The temperature is a value that decides how creative the model is allowed to be (OpenAI accepts values from 0.0 up to 2.0, though it is commonly kept between 0.0 and 1.0). At the lowest setting, 0.0, the model will pretty much produce the same output every time, with only small differences now and then.
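
The handler shown later in this post does not set the temperature explicitly, so it falls back to the API default; passing it is a one-line change. A minimal sketch (the openai and prompts variables are the ones set up in that handler):

// inside the handler shown later in this post
const chat_completion = await openai.createChatCompletion({
    model: "gpt-4",
    temperature: 0.2, // low value: answers stay close to the prompt, with little variation
    messages: [
        { role: "system", content: prompts },
        { role: "user", content: "Who are you?" }
    ],
});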

The ChatComponent

We will come back to the API settings later; for now, let's talk about the frontend, in this case Nuxt 3 (Vue).

I really like Nuxt: it is a solid framework with good features and very good documentation.

I already had an idea of what I wanted to make: a WhatsApp-like chat, with blue bubbles for the user and green bubbles for the bot. Each message is rendered by a small Bubble component (sketched after the main component below).

<script setup lang="ts">


export type TChatResponse = {
    type: "chat" | "error",
    data: {
        message: string
    }
}

type Message = {
    content: string,
    type: "User" | "System" | "Error"
}

const opened = ref<boolean>(false);

const DEFAULT_MESSAGES: Message[] = [
    { content: "Hello, my name is Mario, What can I do for you?", type: "System" }
];

const messages = ref<Message[]>([...DEFAULT_MESSAGES]);
const currentMessage = ref<string>("");

const clearChat = () => {
    // copy the defaults so pushes to messages never mutate DEFAULT_MESSAGES
    messages.value = [...DEFAULT_MESSAGES];
    currentMessage.value = "";
}

const closeChat = () => {
    opened.value = false;
    clearChat();
}

const sendMessage = async () => {
    // ignore empty messages
    if (!currentMessage.value.trim()) return;

    messages.value.push({ content: currentMessage.value, type: "User" });

    // save the message to send and clear the input box
    const messageToSend = currentMessage.value;
    currentMessage.value = "";

    const data = await $fetch<TChatResponse>('/api/chat', {
        method: "POST",
        body: JSON.stringify(
            {
                message: messageToSend
            }
        )
    })

    if (data.type === "error") {
        messages.value.push({ content: data.data.message, type: "Error" });
        return;
    }
    messages.value.push({ content: data.data.message, type: "System" });

};

</script>

<template>
    <div v-if="opened" class="chat-component">
        <header>AI chat with Me!
            <button id="close-button" @click="closeChat">
                <svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24" stroke-width="1.5"
                    stroke="currentColor" class="w-6 h-6">
                    <path stroke-linecap="round" stroke-linejoin="round" d="M6 18L18 6M6 6l12 12" />
                </svg>
            </button>
        </header>
        <ul v-auto-animate id="messages">
            <Bubble v-for="message in messages" :key="message.content" :message="message.content" :type="message.type" />

        </ul>
        <form v-on:submit.prevent="sendMessage">
            <input autocomplete="off" v-model="currentMessage" type="text" id="chat-input"
                placeholder="Ask me something...">
        </form>
        <svg @click="sendMessage" id="arrow-chat" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="currentColor"
            class="w-6 h-6">
            <path
                d="M3.478 2.405a.75.75 0 00-.926.94l2.432 7.905H13.5a.75.75 0 010 1.5H4.984l-2.432 7.905a.75.75 0 00.926.94 60.519 60.519 0 0018.445-8.986.75.75 0 000-1.218A60.517 60.517 0 003.478 2.405z" />
        </svg>

    </div>

    <!-- Show the chat Bubble on the bottom-right corner -->
    <div v-else>
        <button id="chat-button" @click="opened = !opened">
            <svg width="40" height="20" viewBox="0 0 40 20" fill="none" xmlns="http://www.w3.org/2000/svg">
                <path
                    d="M37 6.511C37.884 6.795 38.5 7.639 38.5 8.608V12.894C38.5 14.03 37.653 14.994 36.52 15.087C36.18 15.114 35.84 15.139 35.5 15.159V18.25L32.5 15.25C31.146 15.25 29.806 15.195 28.48 15.087C28.1913 15.0637 27.9105 14.9813 27.655 14.845M37 6.511C36.8455 6.46127 36.6858 6.42939 36.524 6.416C33.8459 6.19368 31.1541 6.19368 28.476 6.416C27.345 6.51 26.5 7.473 26.5 8.608V12.894C26.5 13.731 26.96 14.474 27.655 14.845M37 6.511V4.637C37 3.016 35.848 1.611 34.24 1.402C32.1708 1.13379 30.0865 0.999513 28 1C25.885 1 23.802 1.137 21.76 1.402C20.152 1.611 19 3.016 19 4.637V10.863C19 12.484 20.152 13.889 21.76 14.098C22.337 14.173 22.917 14.238 23.5 14.292V19L27.655 14.845"
                    stroke="black" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round" />
                <path
                    d="M3.86328 4.66211L5.06445 4.59766L7.73047 12.0508L8.72656 12.2793V13H5.24609V12.2793L6.35938 12.0508L5.90234 10.7207H2.78516L2.31641 12.0508L3.44141 12.2793V13H0.142578V12.2793L1.12695 12.0508L3.86328 4.66211ZM4.36719 6.15039L3.10742 9.75977H5.58008L4.36719 6.15039ZM9.07227 12.2793L10.0918 12.0391V5.55859L9.07227 5.31836V4.66211L12.3711 4.59766V5.31836L11.3516 5.55859V12.0391L12.3711 12.2793V13H9.07227V12.2793Z"
                    fill="black" />
            </svg>

        </button>
    </div>
</template>


<style scoped>
#chat-button {
    all: unset;

    width: 4rem;
    height: 4rem;

    position: fixed;
    right: 1rem;
    bottom: 1.1rem;

    display: grid;
    place-items: center;

    border-radius: 0.5rem;
    border: 1px solid rgba(0, 0, 0, 0.093);
    box-shadow: 0px 0px 10px rgba(0, 0, 0, 0.221);

    z-index: 100;
    background-color: white;

    transition: all 0.2s ease-in-out;
}

#chat-button:hover {
    cursor: pointer;
    filter: brightness(0.95);
    box-shadow: 0px 0px 15px rgba(0, 0, 0, 0.1);

}

header {
    display: flex;
    justify-content: space-between;

    background: #f1f1f1;
    padding: 0.5rem;
    border-radius: 10px 0px 0px 0px;
    border-bottom: 1px solid rgba(0, 0, 0, 0.166);
}

#close-button {
    all: unset;
    width: 1rem;
    height: 1rem;
    color: gray;
    background-color: white;
    padding: 0.1rem;
    border-radius: 0.5rem;

    display: grid;
    place-items: center;

    font-size: 1.5rem;

    cursor: pointer;
}

#close-button:hover {
    animation: forwards rotate 500ms ease-in-out;
}

@keyframes rotate {
    0% {
        transform: rotate(0deg);
    }

    100% {
        transform: rotate(180deg);
    }
}


ul {
    all: unset;
    padding: 0.6rem;
}

#arrow-chat {

    position: absolute;
    right: 1.5rem;
    bottom: 1.1rem;

    width: 1rem;
    height: 1rem;
    color: gray;
}

.chat-component {
    display: flex;
    flex-direction: column;
    position: relative;

    z-index: 100;
    position: fixed;
    right: 0;
    bottom: 0;
    width: 90%;
    height: 40%;

    background: white;
    padding: 0.5rem;
    border-radius: 10px 0px 0px 0px;

    /* Create a box shadow on the left up corner */
    box-shadow: -5px 5px 10px rgba(0, 0, 0, 0.1);
}

#messages {
    flex: 1;
    display: flex;
    flex-direction: column;
    gap: 0.4rem;

    overflow-y: scroll;
    scroll-behavior: smooth;
}

#chat-input {
    all: unset;

    width: 95%;
    position: relative;
    border: 1px solid #2978b55c;
    border-radius: 0.5rem;
    padding: 0.5rem;

}

#chat-inputs {
    width: 100%
}

#chat-inputs svg {
    position: absolute;
    height: 1rem;
    width: 1rem;
}

@media screen and (min-width: 768px) {
    .chat-component {
        width: 30%;
        height: 50%;
    }
}
</style>
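
The Bubble component referenced in the template above is not included in the post; here is a minimal sketch of what it could look like, assuming it only styles a single message based on its type (the colors are illustrative, not the ones on my site):

<script setup lang="ts">
// Hypothetical sketch of the Bubble component used above; the real one is not shown in the post
defineProps<{
    message: string,
    type: "User" | "System" | "Error"
}>();
</script>

<template>
    <li class="bubble" :class="type.toLowerCase()">
        {{ message }}
    </li>
</template>

<style scoped>
.bubble {
    max-width: 80%;
    padding: 0.4rem 0.6rem;
    border-radius: 0.5rem;
    list-style: none;
}

/* blue bubbles for the user, green bubbles for the bot */
.user {
    align-self: flex-end;
    background: #2978b5;
    color: white;
}

.system {
    align-self: flex-start;
    background: #7ec8a9;
}

.error {
    align-self: flex-start;
    background: #f8d7da;
}
</style>

The align-self rules work because #messages in the parent component is a flex column, so each bubble can sit on its own side of the chat.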

I used the $fetch API for low-level control with some added benefits. The rest of the component is pretty straightforward; the other thing to know is that this ChatComponent has two states: when closed it's a FAB (Floating Action Button), and when clicked it opens the chat itself.

The API Route

In Nuxt, you can define the API routes you want to access from your application in the server folder, without an external server (like an Express app).
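
For context, any file placed under server/api/ automatically becomes an endpoint under /api/. A minimal sketch, using a hypothetical server/api/hello.ts:

// server/api/hello.ts (hypothetical example)
// defineEventHandler is auto-imported by Nuxt on the server side
export default defineEventHandler(() => {
    return { message: "Hello from the Nuxt server!" }
})

This becomes reachable at /api/hello, for example with $fetch('/api/hello') from the client.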

I defined server/api/chat.ts, and all that code does is:

  1. Retrieve the environment variable for the OPENAI_API_KEY
  2. Send the request with the prompt and the user's question to the API
  3. Retrieve and format the response for easy decoding on the client side

Code

import { Configuration, OpenAIApi, } from "openai-edge";

const prompts = `
   THE SAME PROMPT FROM ABOVE
`;

export default defineEventHandler(async (event) => {

    // Only allow POST requests
    if (event.node.req.method !== 'POST') {
        return {
            type: 'error',
            data: {
                message: 'Only POST requests are allowed'
            }
        }
    }

    // readBody is auto-imported by Nuxt (h3) and parses the JSON request body
    const data = await readBody<{ message: string }>(event);

    const configuration = new Configuration({
        apiKey: process.env.NUXT_OPENAI_API_KEY,
    });

    try {
        const openai = new OpenAIApi(configuration);
        const chat_completion = await openai.createChatCompletion({
            model: "gpt-4",
            messages: [
                { role: "system", content: prompts },
                { role: "user", content: data.message }
            ],
        });

        const resultJson = await chat_completion.json();
        if (resultJson.error)
            throw new Error(resultJson.error.message);

        const resultChat = resultJson.choices[0].message.content;
        return {
            type: 'chat',
            data: {
                message: resultChat
            }
        }
    }
    catch (error: any) {
        return {
            type: 'error',
            data: {
                message: error.message
            }
        }
    }


});

The API route only accepts POST requests, and if an error occurs, it will also be displayed on the client side.


That’s it! You can try it yourself on the homepage of my site.