How to Use GPT-3, GPT-J, and GPT-Neo, With Few-Shot Learning

GPT-3, GPT-J, and GPT-Neo are very powerful AI models. Here we show you how to use these models effectively thanks to few-shot learning. Few-shot learning is like training/fine-tuning an AI model, simply by giving a couple of examples in your prompt.

GPT-3

GPT-3, released by OpenAI, is the most powerful AI model ever released for text understanding and text generation.

It has 175 billion parameters, which makes it extremely versatile and able to understand pretty much anything!

You can do all sorts of things with GPT-3, such as chatbots, content creation, entity extraction, classification, summarization, and much more. But it takes a bit of practice, and using this model correctly is not easy.

GPT-J and GPT-Neo

GPT-Neo and GPT-J are both open-source Natural Language Processing models, created by a collective of researchers working on open-source AI (see the EleutherAI website).

GPT-J has 6 billion parameters, which makes it the most advanced open-source Natural Language Processing model as of this writing. It is a direct alternative to OpenAI's proprietary GPT-3 Curie.

These models are very versatile. They can be used for almost any Natural Language Processing use case: text generation, sentiment analysis, classification, machine translation... and much more (see below). However, using them effectively sometimes takes practice, and their response time (latency) can also be longer than that of more standard Natural Language Processing models.

GPT-J and GPT-Neo are both available on the NLP Cloud API. Below, we show examples obtained with NLP Cloud's GPT-J endpoint on GPU, using the Python client. If you want to copy and paste the examples, please don't forget to add your own API token. To install the Python client, first run the following: pip install nlpcloud.
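
As a quick sanity check before moving on to few-shot prompts, here is a minimal sketch of a plain generation call with the Python client (the token is a placeholder, and the prompt and max_length are arbitrary values chosen for illustration):

import nlpcloud

# Instantiate the client once, pointing at the GPT-J GPU endpoint
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)

# A plain generation call, without any few-shot examples yet
generation = client.generation("GPT-J is a Natural Language Processing model that",
    max_length=50)
print(generation["generated_text"])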

Few-Shot Learning

Few-shot learning consists of helping a machine learning model make predictions thanks to only a couple of examples. No need to train a new model: models like GPT-3, GPT-J, and GPT-Neo are so big that they can easily adapt to many contexts without being re-trained.

Giving the model only a few examples helps it dramatically increase its accuracy.

In Natural Language Processing, the idea is to pass these examples along with your text input. See the examples below!

Also note that, if few-shot learning is not enough, you can fine-tune GPT-3 on OpenAI's website and GPT-J on NLP Cloud, so that the model is perfectly tailored to your use case.

You can easily test few-shot learning on the NLP Cloud playground (try it here).

Sentiment Analysis with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Message: Support has been terrible for 2 weeks...
            Sentiment: Negative
            ###
            Message: I love your API, it is simple and so fast!
            Sentiment: Positive
            ###
            Message: GPT-J has been released 2 months ago.
            Sentiment: Neutral
            ###
            Message: The reactivity of your team has been amazing, thanks!
            Sentiment:""",
    min_length=1,
    max_length=1,            # a single token is enough for the sentiment label
    length_no_input=True,    # min_length/max_length apply to the generated text only
    end_sequence="###",      # stop generating once the "###" delimiter is reached
    remove_end_sequence=True,
    remove_input=True)       # return only the newly generated text, without the prompt
print(generation["generated_text"])

Output:

Positive

As you can see, the fact that we first give 3 properly formatted examples leads GPT-J to understand that we want to perform sentiment analysis. And its result is good.

You can help GPT-J understand the different examples by using a custom delimiter like the following: ###. We could perfectly well use something else, like ---, or simply a new line. Then we set end_sequence, an NLP Cloud parameter that tells GPT-J to stop generating content after a new line + ###: end_sequence="###".
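
For instance, here is the same sentiment prompt sketched with --- as the delimiter instead of ### (a hypothetical variation of the example above; the only requirement is that end_sequence matches whatever delimiter you use in the examples):

import nlpcloud

client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Message: Support has been terrible for 2 weeks...
            Sentiment: Negative
            ---
            Message: I love your API, it is simple and so fast!
            Sentiment: Positive
            ---
            Message: The reactivity of your team has been amazing, thanks!
            Sentiment:""",
    min_length=1,
    max_length=1,
    length_no_input=True,
    end_sequence="---",      # the custom delimiter now doubles as the stop sequence
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])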

HTML Code Generation with GPT-J

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""description: a red button that says stop
    code: <button style=color:white; background-color:red;>Stop</button>
    ###
    description: a blue box that contains yellow circles with red borders
    code: <div style=background-color: blue; padding: 20px;><div style=background-color: yellow; border: 5px solid red; border-radius: 50%; padding: 20px; width: 100px; height: 100px;>
    ###
    description: a Headline saying Welcome to AI
    code:""",
    max_length=500,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

<h1 style=color: white;>Welcome to AI</h1>

Code generation with GPT-J is really amazing. This is partly due to the fact that GPT-J was trained on huge code bases.

SQL Code Generation with GPT-J

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Question: Fetch the companies that have less than five people in it.
            Answer: SELECT COMPANY, COUNT(EMPLOYEE_ID) FROM Employee GROUP BY COMPANY HAVING COUNT(EMPLOYEE_ID) < 5;
            ###
            Question: Show all companies along with the number of employees in each department
            Answer: SELECT COMPANY, COUNT(COMPANY) FROM Employee GROUP BY COMPANY;
            ###
            Question: Show the last record of the Employee table
            Answer: SELECT * FROM Employee ORDER BY LAST_NAME DESC LIMIT 1;
            ###
            Question: Fetch three employees from the Employee table;
            Answer:""",
    max_length=100,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

SELECT * FROM Employee ORDER BY ID DESC LIMIT 3;

Automatic SQL generation works very well with GPT-J, mainly because of the declarative nature of SQL, and because SQL is a fairly constrained language with relatively few possibilities (compared to most programming languages).

Advanced Entity Extraction (NER) with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""[Text]: Fred is a serial entrepreneur. Co-founder and CEO of Platform.sh, he previously co-founded Commerce Guys, a leading Drupal ecommerce provider. His mission is to guarantee that as we continue on an ambitious journey to profoundly transform how cloud computing is used and perceived, we keep our feet well on the ground continuing the rapid growth we have enjoyed up until now. 
        [Name]: Fred
        [Position]: Co-founder and CEO
        [Company]: Platform.sh
        ###
        [Text]: Microsoft (the word being a portmanteau of "microcomputer software") was founded by Bill Gates on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. Steve Ballmer replaced Gates as CEO in 2000, and later envisioned a "devices and services" strategy.
        [Name]:  Steve Ballmer
        [Position]: CEO
        [Company]: Microsoft
        ###
        [Text]: Franck Riboud was born on 7 November 1955 in Lyon. He is the son of Antoine Riboud, the previous CEO, who transformed the former European glassmaker BSN Group into a leading player in the food industry. He is the CEO at Danone.
        [Name]:  Franck Riboud
        [Position]: CEO
        [Company]: Danone
        ###
        [Text]: David Melvin is an investment and financial services professional at CITIC CLSA with over 30 years’ experience in investment banking and private equity. He is currently a Senior Adviser of CITIC CLSA.
""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

[Name]: David Melvin
[Position]: Senior Adviser
[Company]: CITIC CLSA

As you can see, GPT-J is very good at extracting structured data from unstructured text. It is really impressive how GPT-J solves entity extraction without any re-training! Usually, extracting new types of entities (like name, position, country, etc.) requires a whole new process of annotation, training, and deployment... Here, it is completely seamless.

Question Answering with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Context: NLP Cloud was founded in 2021 when the team realized there was no easy way to reliably leverage Natural Language Processing in production.
            Question: When was NLP Cloud founded?
            Answer: 2021
            ###
            Context: NLP Cloud developed their API by mid-2020 and they added many pre-trained open-source models since then.
            Question: What did NLP Cloud develop?
            Answer: API
            ###
            Context: All plans can be stopped anytime. You only pay for the time you used the service. In case of a downgrade, you will get a discount on your next invoice.
            Question: When can plans be stopped?
            Answer: Anytime
            ###
            Context: The main challenge with GPT-J is memory consumption. Using a GPU plan is recommended.
            Question: Which plan is recommended for GPT-J?
            Answer:""",
    min_length=1,
    max_length=20,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

GPU plan

Question answering works very well. It can also be achieved with other dedicated Natural Language Processing models, but perhaps not with the same level of accuracy.

Tutorial Creation with GPT-J

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Here is a tutorial about how to make a cake.
        
            1. Take some flour.
            2. Take some sugar.""",
    max_length=500)
print(generation["generated_text"])

Output:

Here is a tutorial how to make a cake.
        
                    1. Take some flour.
                    2. Take some sugar.
                    3. Take some butter.
                    4. Take some eggs.
                    5. Take some water.
                    6. Take some baking powder.
                    7. Take some vanilla.
                    8. Mix all together.
                    9. Bake in a pan.
                    10. Enjoy.
                    
Well, that's it. You can make this for your birthday or a party or you can even make it for your kids. They will love this.

As you can see, it is quite impressive how GPT-J automatically follows your initial formatting, and the generated content is also very good. You could make an actual cake out of this (I haven't tried it yet, though).

Grammar and Spelling Correction with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""I love goin to the beach.
            Correction: I love going to the beach.
            ###
            Let me hav it!
            Correction: Let me have it!
            ###
            It have too many drawbacks.
            Correction: It has too many drawbacks.
            ###
            I do not wan to go
            Correction:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

I do not want to go.

Spelling and grammar correction works as expected. If you want to be more specific about the position of the error in the sentence, you might want to use a dedicated model.

Machine Translation with GPT-J

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Hugging Face a révolutionné le NLP.
            Translation: Hugging Face revolutionized NLP.
            ###
            Cela est incroyable!
            Translation: This is unbelievable!
            ###
            Désolé je ne peux pas.
            Translation: Sorry but I cannot.
            ###
            NLP Cloud permet de deployer le NLP en production facilement.
            Translation""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

NLP Cloud makes it easy to deploy NLP to production.

Machine translation usually requires dedicated models (often one per language). Here, all languages are handled out of the box by GPT-J, which is quite impressive.

Tweet Generation with GPT-J

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""keyword: markets
            tweet: Take feedback from nature and markets, not from people
            ###
            keyword: children
            tweet: Maybe we die so we can come back as children.
            ###
            keyword: startups
            tweet: Startups should not worry about how to put out fires, they should worry about how to start them.
            ###
            keyword: NLP
            tweet:""",
    max_length=200,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

People want a way to get the benefits of NLP without paying for it.

Here is a fun and easy way to generate short tweets following a given context.

Chatbot and Conversational AI with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""This is a discussion between a [human] and a [robot]. 
The [robot] is very nice and empathetic.

[human]: Hello nice to meet you.
[robot]: Nice to meet you too.
###
[human]: How is it going today?
[robot]: Not so bad, thank you! How about you?
###
[human]: I am ok, but I am a bit sad...
[robot]: Oh? Why that?
###
[human]: I broke up with my girlfriend...
[robot]: """,
    min_length=1,
    max_length=20,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

Oh? How did that happen?

As you can see, GPT-J correctly understands that you are in conversational mode. And the really powerful thing is that, if you change the tone of your context, the model's responses will follow the same tone (sarcasm, anger, curiosity...).

We actually wrote a dedicated blog article about how to build a chatbot with GPT-3/GPT-J, feel free to read it!

Intent Classification with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""I want to start coding tomorrow because it seems to be so fun!
            Intent: start coding
            ###
            Show me the last pictures you have please.
            Intent: show pictures
            ###
            Search all these files as fast as possible.
            Intent: search files
            ###
            Can you please teach me Chinese next week?
            Intent:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

learn chinese

It is quite impressive how GPT-J can detect the intent of your sentence. It works very well for complex sentences too. You can even ask it to format the intent differently if you want. For example, you could automatically derive a JavaScript function name like "learnChinese", as sketched below.
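
Here is a small sketch of how that post-processing could be done on the client side (the helper below is hypothetical, not part of the NLP Cloud client):

# Turn an intent like "learn chinese" into a camelCase function name like "learnChinese"
def intent_to_function_name(intent):
    words = intent.strip().split()
    if not words:
        return ""
    return words[0].lower() + "".join(word.capitalize() for word in words[1:])

print(intent_to_function_name("learn chinese"))  # learnChinese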

Paraphrasing with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""[Original]: Algeria recalled its ambassador to Paris on Saturday and closed its airspace to French military planes a day later after the French president made comments about the northern Africa country. 
        [Paraphrase]: Last Saturday, the Algerian government recalled its ambassador and stopped accepting French military airplanes in its airspace. It happened one day after the French president made comments about Algeria.
        ###
        [Original]: President Macron was quoted as saying the former French colony was ruled by a "political-military system" with an official history that was based not on truth, but on hatred of France.
        [Paraphrase]: Emmanuel Macron said that the former colony was lying and angry at France. He also said that the country was ruled by a "political-military system".
        ###
        [Original]: The diplomatic spat came days after France cut the number of visas it issues for citizens of Algeria and other North African countries.
        [Paraphrase]: Diplomatic issues started appearing when France decided to stop granting visas to Algerian people and other North African people.
        ###
        [Original]: After a war lasting 20 years, following the decision taken first by President Trump and then by President Biden to withdraw American troops, Kabul, the capital of Afghanistan, fell within a few hours to the Taliban, without resistance.
        [Paraphrase]:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True,
    min_length=0,
    max_length=50)
print(generation["generated_text"])

Output:

French President Emmanuel Macron hopes the diplomatic tension with Algeria will calm down. He wants to make progress in the relations with the former French colony. President Macron says he has a good relationship with his Algerian counterpart and he is glad that they have a cordial relationship. The president was quoted as saying that Algeria was ruled by a "political-military system" with an official history that was based not on truth, but on hatred of France. France cut the number of visas it issues for citizens of Algeria and other North African countries.

GPT-J made changes to our paragraph while keeping the main meaning, which is what paraphrasing is about. You could perfectly well encourage GPT-J to return more original paraphrases by passing different examples in the input, and by playing with API parameters like temperature, top_p, repetition penalty...
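
For example, here is a sketch of a shorter paraphrasing call with sampling parameters added (we assume here that the client exposes them as temperature, top_p, and repetition_penalty keyword arguments; check the NLP Cloud documentation for the exact names and ranges):

import nlpcloud

client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""[Original]: The diplomatic spat came days after France cut the number of visas it issues for citizens of Algeria and other North African countries.
        [Paraphrase]: Diplomatic issues started appearing when France decided to stop granting visas to Algerian people and other North African people.
        ###
        [Original]: After a war lasting 20 years, following the decision taken first by President Trump and then by President Biden to withdraw American troops, Kabul, the capital of Afghanistan, fell within a few hours to the Taliban, without resistance.
        [Paraphrase]:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True,
    max_length=60,
    temperature=0.8,          # higher temperature encourages more creative rewrites
    top_p=0.9,                # nucleus sampling keeps only the most likely tokens
    repetition_penalty=1.1)   # discourages copying the input verbatim
print(generation["generated_text"])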

Summarization with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""[Original]: America has changed dramatically during recent years. Not only has the number of graduates in traditional engineering disciplines such as mechanical, civil, electrical, chemical, and aeronautical engineering declined, but in most of the premier American universities engineering curricula now concentrate on and encourage largely the study of engineering science.  As a result, there are declining offerings in engineering subjects dealing with infrastructure, the environment, and related issues, and greater concentration on high technology subjects, largely supporting increasingly complex scientific developments. While the latter is important, it should not be at the expense of more traditional engineering.
        Rapidly developing economies such as China and India, as well as other industrial countries in Europe and Asia, continue to encourage and advance the teaching of engineering. Both China and India, respectively, graduate six and eight times as many traditional engineers as does the United States. Other industrial countries at minimum maintain their output, while America suffers an increasingly serious decline in the number of engineering graduates and a lack of well-educated engineers. 
        (Source:  Excerpted from Frankel, E.G. (2008, May/June) Change in education: The cost of sacrificing fundamentals. MIT Faculty 
        [Summary]: MIT Professor Emeritus Ernst G. Frankel (2008) has called for a return to a course of study that emphasizes the traditional skills of engineering, noting that the number of American engineering graduates with these skills has fallen sharply when compared to the number coming from other countries. 
        ###
        [Original]: So how do you go about identifying your strengths and weaknesses, and analyzing the opportunities and threats that flow from them? SWOT Analysis is a useful technique that helps you to do this.
        What makes SWOT especially powerful is that, with a little thought, it can help you to uncover opportunities that you would not otherwise have spotted. And by understanding your weaknesses, you can manage and eliminate threats that might otherwise hurt your ability to move forward in your role.
        If you look at yourself using the SWOT framework, you can start to separate yourself from your peers, and further develop the specialized talents and abilities that you need in order to advance your career and to help you achieve your personal goals.
        [Summary]: SWOT Analysis is a technique that helps you identify strengths, weakness, opportunities, and threats. Understanding and managing these factors helps you to develop the abilities you need to achieve your goals and progress in your career.
        ###
        [Original]: Jupiter is the fifth planet from the Sun and the largest in the Solar System. It is a gas giant with a mass one-thousandth that of the Sun, but two-and-a-half times that of all the other planets in the Solar System combined. Jupiter is one of the brightest objects visible to the naked eye in the night sky, and has been known to ancient civilizations since before recorded history. It is named after the Roman god Jupiter.[19] When viewed from Earth, Jupiter can be bright enough for its reflected light to cast visible shadows,[20] and is on average the third-brightest natural object in the night sky after the Moon and Venus.
        Jupiter is primarily composed of hydrogen with a quarter of its mass being helium, though helium comprises only about a tenth of the number of molecules. It may also have a rocky core of heavier elements,[21] but like the other giant planets, Jupiter lacks a well-defined solid surface. Because of its rapid rotation, the planet's shape is that of an oblate spheroid (it has a slight but noticeable bulge around the equator).
        [Summary]: Jupiter is the largest planet in the solar system. It is a gas giant, and is the fifth planet from the sun.
        ###
        [Original]: For all its whizz-bang caper-gone-wrong energy, and for all its subsequent emotional troughs, this week’s Succession finale might have been the most important in its entire run. Because, unless I am very much wrong, Succession – a show about people trying to forcefully mount a succession – just had its succession. And now everything has to change.
        The episode ended with Logan Roy defying his children by selling Waystar Royco to idiosyncratic Swedish tech bro Lukas Matsson. It’s an unexpected twist, like if King Lear contained a weird new beat where Lear hands the British crown to Jack Dorsey for a laugh, but it sets up a bold new future for the show. What will happen in season four? Here are some theories.
        Season three of Succession picked up seconds after season two ended. It was a smart move, showing the immediate swirl of confusion that followed Kendall Roy’s decision to undo his father, and something similar could happen here. This week’s episode ended with three of the Roy siblings heartbroken and angry at their father’s grand betrayal. Perhaps season four could pick up at that precise moment, and show their efforts to reorganise their rebellion against him. This is something that Succession undoubtedly does very well – for the most part, its greatest moments have been those heart-thumping scenes where Kendall scraps for support to unseat his dad – and Jesse Armstrong has more than enough dramatic clout to centre the entire season around the battle to stop the Matsson deal dead in its tracks.
        [Summary]:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True,
    min_length=20,
    max_length=200)
print(generation["generated_text"])

Output:

Season 3 of Succession ended with Logan Roy trying to sell his company to Lukas Matsson.

Text summarization is a hard task. GPT-J is very good at it, as long as you give it the right examples. The size and the tone of the summary depend a lot on the examples you created. For instance, you would not create the same kind of examples for a simple summary aimed at children as for an advanced medical summary aimed at doctors. If GPT-J's input size is too small for your summarization examples, you might want to fine-tune GPT-J for your summarization task.
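
If you are not sure whether your examples fit, a rough sketch like the following can help you count tokens before calling the API (this assumes the Hugging Face transformers library is installed; the 2048-token context window is our assumption here, so double-check it against the GPT-J model card):

from transformers import AutoTokenizer

# Load the GPT-J tokenizer from the Hugging Face Hub (tokenizer files only, not the weights)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")

prompt = "..."  # your few-shot summarization examples plus the text to summarize

num_tokens = len(tokenizer(prompt)["input_ids"])
context_window = 2048  # assumed GPT-J context size
print(f"{num_tokens} tokens used, {context_window - num_tokens} left for the generated summary")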

Zero-Shot Text Classification with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Message: When the spaceship landed on Mars, the whole humanity was excited
        Topic: space
        ###
        Message: I love playing tennis and golf. I'm practicing twice a week.
        Topic: sport
        ###
        Message: Managing a team of sales people is a tough but rewarding job.
        Topic: business
        ###
        Message: I am trying to cook chicken with tomatoes.
        Topic:""",
    min_length=1,
    max_length=5,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

food

Here is an easy and powerful way to categorize a piece of text, thanks to the so-called "zero-shot learning" technique, without having to declare categories in advance.

Keyword and Keyphrase Extraction with GPT-J

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Information Retrieval (IR) is the process of obtaining resources relevant to the information need. For instance, a search query on a web search engine can be an information need. The search engine can return web pages that represent relevant resources.
        Keywords: information, search, resources
        ###
        David Robinson has been in Arizona for the last three months searching for his 24-year-old son, Daniel Robinson, who went missing after leaving a work site in the desert in his Jeep Renegade on June 23. 
        Keywords: searching, missing, desert
        ###
        I believe that using a document about a topic that the readers know quite a bit about helps you understand if the resulting keyphrases are of quality.
        Keywords: document, understand, keyphrases
        ###
        Since transformer models have a token limit, you might run into some errors when inputting large documents. In that case, you could consider splitting up your document into paragraphs and mean pooling (taking the average of) the resulting vectors.
        Keywords:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

paragraphs, transformer, input, errors

Keyword extraction is about getting the main ideas out of a piece of text. This is an interesting Natural Language Processing subfield that GPT-J can handle very well. See below for keyphrase extraction (the same thing, but with several words).

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Information Retrieval (IR) is the process of obtaining resources relevant to the information need. For instance, a search query on a web search engine can be an information need. The search engine can return web pages that represent relevant resources.
        Keywords: information retrieval, search query, relevant resources
        ###
        David Robinson has been in Arizona for the last three months searching for his 24-year-old son, Daniel Robinson, who went missing after leaving a work site in the desert in his Jeep Renegade on June 23. 
        Keywords: searching son, missing after work, desert
        ###
        I believe that using a document about a topic that the readers know quite a bit about helps you understand if the resulting keyphrases are of quality.
        Keywords: document, help understand, resulting keyphrases
        ###
        Since transformer models have a token limit, you might run into some errors when inputting large documents. In that case, you could consider splitting up your document into paragraphs and mean pooling (taking the average of) the resulting vectors.
        Keywords:""",
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

large documents, paragraph, mean pooling

Same example as above, except that this time we don't want to extract a single word but several words (called a keyphrase).

Product Description and Ad Generation

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Generate a product description out of keywords.

        Keywords: shoes, women, $59
        Sentence: Beautiful shoes for women at the price of $59.
        ###
        Keywords: trousers, men, $69
        Sentence: Modern trousers for men, for $69 only.
        ###
        Keywords: gloves, winter, $19
        Sentence: Amazingly hot gloves for cold winters, at $19.
        ###
        Keywords: t-shirt, men, $39
        Sentence:""",
    min_length=5,
    max_length=30,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

Extraordinary t-shirt for men, for $39 only.

You can ask GPT-J to generate a product description or an ad containing specific keywords. Here we are only generating a simple sentence, but we could easily generate a whole paragraph if needed.
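
For instance, here is a sketch of the same idea extended to paragraph-length descriptions: the longer example descriptions below are made up for illustration, and the only real changes are those richer examples and a larger max_length:

import nlpcloud

client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""Generate a product description out of keywords.

        Keywords: shoes, women, $59
        Description: These beautiful women's shoes combine comfort and elegance at an affordable price. Crafted with care, they are perfect for everyday wear as well as special occasions, all for only $59.
        ###
        Keywords: gloves, winter, $19
        Description: Stay warm all winter long with these amazingly hot gloves. Designed for cold weather and built to last, they are a steal at just $19.
        ###
        Keywords: t-shirt, men, $39
        Description:""",
    min_length=30,
    max_length=120,          # allow a full paragraph instead of a single sentence
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])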

Blog Post Generation

Test it on the playground

import nlpcloud
client = nlpcloud.Client("gpt-j", "your_token", gpu=True)
generation = client.generation("""[Title]: 3 Tips to Increase the Effectiveness of Online Learning
[Blog article]: <h1>3 Tips to Increase the Effectiveness of Online Learning</h1>
<p>The hurdles associated with online learning correlate with the teacher’s inability to build a personal relationship with their students and to monitor their productivity during class.</p>
<h2>1. Creative and Effective Approach</h2>
<p>Each aspect of online teaching, from curriculum, theory, and practice, to administration and technology, should be formulated in a way that promotes productivity and the effectiveness of online learning.</p>
<h2>2. Utilize Multimedia Tools in Lectures</h2>
<p>In the 21st century, networking is crucial in every sphere of life. In most cases, a simple and functional interface is preferred for eLearning to create ease for the students as well as the teacher.</p>
<h2>3. Respond to Regular Feedback</h2>
<p>Collecting student feedback can help identify which methods increase the effectiveness of online learning, and which ones need improvement. An effective learning environment is a continuous work in progress.</p>
###
[Title]: 4 Tips for Teachers Shifting to Teaching Online 
[Blog article]: <h1>4 Tips for Teachers Shifting to Teaching Online </h1>
<p>An educator with experience in distance learning shares what he’s learned: Keep it simple, and build in as much contact as possible.</p>
<h2>1. Simplicity Is Key</h2>
<p>Every teacher knows what it’s like to explain new instructions to their students. It usually starts with a whole group walk-through, followed by an endless stream of questions from students to clarify next steps.</p>
<h2>2. Establish a Digital Home Base</h2>
<p>In the spirit of simplicity, it’s vital to have a digital home base for your students. This can be a district-provided learning management system like Canvas or Google Classrooms, or it can be a self-created class website. I recommend Google Sites as a simple, easy-to-set-up platform.</p>
<h2>3. Prioritize Longer, Student-Driven Assignments</h2>
<p>Efficiency is key when designing distance learning experiences. Planning is going to take more time and require a high level of attention to detail. You will not be able to correct mistakes on the fly or suddenly pivot when kids are disengaged.</p>
<h2>4. Individual Touchpoints Are Game-Changers</h2>
<p>You can create these touchpoints through any medium you like: emails, video messages, phone calls, messages through your learning management system, comments on shared documents, etc.</p>
###
[Title]: 3 Best Free Image Converter Software Programs
[Blog article]: <h1>3 Best Free Image Converter Software Programs</h1>
<p>Best free image converters for JPG, BMP, PSD, TIF, GIF, RAW, and more</p>
<h2>1. XnConvert</h2>
<p>XnConvert is the Swiss Army knife of image converters. With XnView, you can convert any of around 500 image formats to your choice of around 80 others. If you have a rare image format that you can't open, XnView can probably convert it.</p>
<h2>2. CoolUtils Online Image Converter</h2>
<p>The CoolUtils Online Image Converter is just that—an image converter that exists completely online, no download required. Unlike some online image converters, this one does the image converting for you in real-time—no waiting on an email link.</p>
<h2>3. FileZigZag</h2>
<p>FileZigZag is another online image converter service that will convert most common graphics formats. Just upload the original image, choose the desired output, and then wait for the download link to appear on the page.</p>
###
[Title]: 4 Best Tips to Find a Girlfriend
[Blog article]: """,
    max_length=500,
    length_no_input=True,
    end_sequence="###",
    remove_end_sequence=True,
    remove_input=True)
print(generation["generated_text"])

Output:

<h1>Text Generation API</h1>
<h2>What is Text Generation?</h2>
<p>Text generation is the process of generating text from a block of text. It is not simply the inverse of summarization: a text generator will take a short piece of text as an input and generate a longer one.</p>
<p>For example you have the following text:</p>
Ahead of the opening of the meeting,
<p>In this case the goal of the API is to automatically generate the following:</p>
Ahead of the opening of the meeting, Prime Minister Justin Trudeau spoke to reporters at the Montreal airport where he was travelling to speak at the conference.
<p>So basically we want to leverage the output of the API as the input for the text generator, and we will then be able to generate some sentences based on the input we already have.</p>
<h2>Text generation with GPT-2</h2>
<p>We will leverage GPT-2 in order to have a starting point for the text generator. In this case, we will use the API as the input for the text generator, and we will use the output of the API as the input for GPT-2.
However, we want to fine-tune GPT-2 to generate more complex sentences and to have better results.</p>

Isn't it impressive? This generated blog article is short, but you can generate much longer articles. The structure of the generated blog article really depends on the structure you used in your few-shot examples. To get more complex structures and more relevant content, fine-tuning GPT-J is key.

Conclusion

As you can see, few-shot learning is a great technique that helps GPT-3, GPT-J, and GPT-Neo achieve amazing things! The key here is to pass a proper context before making your request.

Even for simple text generation, it is recommended to pass as much context as possible, in order to help the model.

I hope you found this useful! If you have any questions about how to make the most of these models, please don't hesitate to ask us.

Julien Salinas
CTO at NLP Cloud