Saturday, 29 November 2025

Prompt Engineering using Gemini

 Hi all

Question : Prompt engineering code in Python that accepts a prompt and gets the response from Gemini using an API key

Answer code

---------------------

from google import genai
from google.genai.errors import APIError

# NOTE: Hardcoding API keys is generally discouraged in production environments
# for security reasons. Environment variables or secret management tools are preferred.
# This is done here only to keep the example self-contained.

HARDCODED_API_KEY = "Your API key generated from Google AI Studio"


def initialize_gemini(api_key: str):
    """Initializes the Gemini client using the hardcoded API key."""
    try:
        # Pass the API key directly to the Client constructor.
        client = genai.Client(api_key=api_key)
        return client
    except Exception as e:
        # Note: If the API key is invalid or permissions are wrong, an error
        # might be raised here or during the first API call.
        print(f"Error initializing client: {e}")
        return None


def run_prompt_engineering_lesson(client):
    """
    Demonstrates prompt engineering using a Few-Shot Prompting technique.
    The goal is to teach the model a new, specific output format.
    """
    if not client:
        print("Cannot run lesson because the Gemini client failed to initialize.")
        return

    print("--- 📚 Prompt Engineering Lesson: Few-Shot Prompting ---")

    # --- 1. The Prompt ---
    # We provide a few examples ("shots") to guide the model's behavior.
    # The goal is to make it translate casual terms into formal business language.
    # The user's phrase is inserted into the final query slot, so the few-shot
    # examples are kept rather than overwritten.
    casual_phrase = input('Enter a casual phrase to translate: ')

    prompt = f"""
    **Instructions:** You are an expert business communication consultant.
    Your task is to translate casual, everyday phrases into formal, professional business language.

    **Examples (Few-Shots):**
    1. Casual: "We messed up the schedule."
    Formal: "We encountered an unforeseen discrepancy in the project timeline."
    2. Casual: "Can you email me the slides?"
    Formal: "Kindly forward the presentation deck via electronic mail."
    3. Casual: "Let's catch up later today."
    Formal: "I propose we schedule a debriefing session later this afternoon."

    **Your Turn (The Query):**
    4. Casual: "{casual_phrase}"
    Formal:"""

    print("\n[Input Prompt Sent to Gemini Model]:")
    print("----------------------------------------------------------------")
    print(prompt.strip())
    print("----------------------------------------------------------------")

    # --- 2. API Call ---
    # We use a fast, capable model like gemini-2.5-flash.
    try:
        response = client.models.generate_content(
            model='gemini-2.5-flash',
            contents=prompt,
            # A low temperature gives more consistent responses.
            config={"temperature": 0.2}
        )

        # --- 3. Output ---
        print("\n[Model Response Received]:")
        print("----------------------------------------------------------------")

        # Process the response text to extract only the formal translation.
        full_text = response.text
        if "Formal:" in full_text:
            # Take the text following the last "Formal:" marker.
            formal_translation = full_text.split("Formal:")[-1].strip()
        else:
            # If the model only returned the answer, just strip it.
            formal_translation = full_text.strip()

        print(formal_translation)
        print("----------------------------------------------------------------")

        print("\n**✨ Prompt Engineering Concept Demonstrated: Few-Shot Learning.**")
        print("By providing examples, we taught the model a specific format (Casual -> Formal) without explicit coding.")

    except APIError as e:
        print(f"\nAn API error occurred: {e}")
        print("Check your API key and ensure you haven't exceeded the free tier quota.")
    except Exception as e:
        print(f"\nAn unexpected error occurred: {e}")


if __name__ == "__main__":
    gemini_client = initialize_gemini(HARDCODED_API_KEY)
    if gemini_client:
        run_prompt_engineering_lesson(gemini_client)
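
As the note at the top of the code says, hardcoding the key is only acceptable for a quick demo. A minimal sketch of the preferred approach, assuming you have exported a GEMINI_API_KEY environment variable beforehand:

---------------------

import os
from google import genai

# Read the key from the environment instead of hardcoding it.
api_key = os.environ.get("GEMINI_API_KEY")
if not api_key:
    raise RuntimeError("Set the GEMINI_API_KEY environment variable first.")

client = genai.Client(api_key=api_key)

---------------------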


Saturday, 6 April 2024

Save and load tokenizer and model

 Hi all

-----------------------------------------

from google.colab import drive
drive.mount('/content/drive')
%cd /content/drive/MyDrive/HUPHealth

import joblib

# 'model' is assumed to be an already-trained model object.
file_path = "model.joblib"
joblib.dump(model, file_path)
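
The saved model can be loaded back with joblib.load (a minimal sketch, assuming the same working directory):

loaded_model = joblib.load("model.joblib")
# loaded_model now behaves like the original object, e.g. loaded_model.predict(...)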

-------------------------------------------

from tensorflow.keras.models import load_model

emotion_model = load_model("HUPYogadeep.h5") 

--------------------------------------

from keras.models import load_model
hupmodel = load_model("hup_sentimental_lstm.h5")
------------------------------
# 'model' is an already-trained Keras model in memory;
# saving it does not require importing load_model.
model.save("hupmodel18.h5")
-------------------------------
from sklearn.preprocessing import LabelEncoder
import joblib

# Encode the string labels as integers and save the fitted encoder
# so the same mapping can be reused at prediction time.
labelencoder = LabelEncoder()
y = labelencoder.fit_transform(df['cyberbullying_type'])
joblib.dump(labelencoder, 'huplabelencoder.pkl')

from tensorflow.keras.utils import to_categorical

# One-hot encode the integer labels for training a softmax classifier.
y1 = to_categorical(y)
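
The prediction code in the next snippet loads a tokenizer from 'huptokenizer.joblib', so the fitted tokenizer has to be saved the same way. A minimal sketch, assuming 'tokenizer' is a Keras Tokenizer already fitted on the training texts:

# 'tokenizer' is assumed to be fitted via tokenizer.fit_on_texts(training_texts)
joblib.dump(tokenizer, 'huptokenizer.joblib')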
--------------------------------------------
import nltk
from nltk.corpus import stopwords
from nltk.stem.porter import PorterStemmer
import pandas as pd
import re
import joblib
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

nltk.download('stopwords')
port_stem = PorterStemmer()

def stemming(content):
    # Keep letters only, lowercase, drop stopwords, and stem each word.
    stemmed_content = re.sub('[^a-zA-Z]', ' ', content)
    stemmed_content = stemmed_content.lower()
    stemmed_content = stemmed_content.split()
    stemmed_content = [port_stem.stem(word) for word in stemmed_content if word not in stopwords.words('english')]
    stemmed_content = ' '.join(stemmed_content)
    return stemmed_content

hupstr = input('Enter the tweet: ')
df1 = pd.DataFrame({'text': [hupstr]})
df1['text'] = df1['text'].apply(stemming)

from keras.models import load_model
##hupmodel = load_model("hupmodel17.h5")
hupmodel = load_model("my_model.keras")

# Convert the cleaned text to padded integer sequences with the saved tokenizer.
loaded_vectorizer = joblib.load('huptokenizer.joblib')
X2 = loaded_vectorizer.texts_to_sequences(df1['text'])
X2 = pad_sequences(X2, maxlen=337)

# Predict, then map the class index back to its original label.
hupprediction1 = hupmodel.predict_on_batch(np.stack(X2))
labelencoder = joblib.load('huplabelencoder.pkl')
huplabel1 = labelencoder.inverse_transform(np.argmax(hupprediction1, axis=1))
hupresult1 = ''.join(huplabel1)
print(hupresult1)
------------
# Two ways to run prediction on a preprocessed batch:
hupprediction1 = model.predict([data3]).argmax(axis=1)    # class indices
OR
hupprediction1 = model.predict_on_batch(np.stack(data3))  # raw class probabilities

Saturday, 16 March 2024

Text to Vector, Preprocessing and Loading to Model

 Hi

import nltk
from nltk.corpus import stopwords
from nltk.stem.porter import PorterStemmer
import pandas as pd
import re
import joblib
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

nltk.download('stopwords')
port_stem = PorterStemmer()

def stemming(content):
    # Keep letters only, lowercase, drop stopwords, and stem each word.
    stemmed_content = re.sub('[^a-zA-Z]', ' ', content)
    stemmed_content = stemmed_content.lower()
    stemmed_content = stemmed_content.split()
    stemmed_content = [port_stem.stem(word) for word in stemmed_content if word not in stopwords.words('english')]
    stemmed_content = ' '.join(stemmed_content)
    return stemmed_content

hupstr = input('Enter the tweet: ')
df1 = pd.DataFrame({'text': [hupstr]})
df1['text'] = df1['text'].apply(stemming)

from keras.models import load_model
hupmodel = load_model("hupmodel17.h5")

# Convert the cleaned text to padded integer sequences with the saved tokenizer.
loaded_vectorizer = joblib.load('huptokenizer.joblib')
X2 = loaded_vectorizer.texts_to_sequences(df1['text'])
X2 = pad_sequences(X2, maxlen=337)

#X_test_transformed = loaded_vectorizer.transform(hupinput)
#X_test_dense = X_test_transformed.toarray()
hupprediction1 = hupmodel.predict_on_batch(np.stack(X2))
labelencoder = joblib.load('huplabelencoder.pkl')
huplabel1 = labelencoder.inverse_transform(np.argmax(hupprediction1, axis=1))
hupresult1 = ''.join(huplabel1)
print(hupresult1)

Print TensorFlow version

 Hi 

Try this


import tensorflow as tf
print(tf.__version__)
import keras
print(keras.__version__)

Tuesday, 12 March 2024

How Flask file shows templates and images

 Hi

A basic Flask app looks like this:

--------------

from flask import Flask, redirect, url_for, request
app = Flask(__name__)
 
 
@app.route('/success/<name>')
def success(name):
    return 'Hi %s' % name
 
 
@app.route('/login', methods=['POST', 'GET'])
def login():
    if request.method == 'POST':
        user = request.form['name']
        return redirect(url_for('success', name=user))
    else:
        user = request.args.get('name')
        return redirect(url_for('success', name=user))
 
 
if __name__ == '__main__':
    app.run(debug=True)

--------------

If you need a template page, your Flask file should import render_template and return the following:

return render_template('template.html', result=hupresult)

Your template.html should be saved inside the 'templates' folder, and its code should look like this:


<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Flask Image Example</title>
</head>
<body>
    <h1>{{ result }}</h1>

    <!-- Display the image from the static folder -->
    <img src="{{ url_for('static', filename='bg.jpg') }}" alt="Your Image">
</body>
</html>


The image bg.jpg should be saved inside the 'static' folder.
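
Putting it together, a minimal Flask file that renders this template could look like the sketch below (the value of hupresult is a placeholder; in a real app it would come from your model):

--------------

from flask import Flask, render_template

app = Flask(__name__)


@app.route('/')
def index():
    # Placeholder result; replace with your model's output.
    hupresult = 'Hello from Flask'
    return render_template('template.html', result=hupresult)


if __name__ == '__main__':
    app.run(debug=True)

--------------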


Tuesday, 5 March 2024

Face Emotion Detection - Colab

 Hi all

Step 1: Open the Colab

Step 2: File -> Save a copy in Drive

Step 3: Work with the copy

Link

https://colab.research.google.com/drive/1K-XhE_qW0VLWrPmjycz8kriRYmA77M33?usp=sharing

Face emotion dataset

 Hi all


https://drive.google.com/drive/folders/1PaAA1P_5O8bNV42AHxNEHyzmQuMsR3kW?usp=sharing