ChatGPT for Cybersecurity Cookbook

In this recipe, we'll create a Linux-style manual page generator that accepts a tool's name as user input and generates manual-page output similar to what the man command produces in a Linux terminal. In doing so, we will learn how to use variables in a text file to create a standard prompt template that can be easily modified by changing certain aspects of it. This approach is particularly useful when you want to use user input or other dynamic content as part of the prompt while maintaining a consistent structure.
Ensure you have access to the ChatGPT API by logging in to your OpenAI account, and have Python and the openai module installed.
Using a text file that contains the prompt and placeholder variables, we can create a Python script that will replace the placeholder with user input. In this example, we will use this technique to create a Linux-style manual page generator. Here are the steps:
1. Import the OpenAI class and define the open_file() function, which we will use to read files:

    from openai import OpenAI

    def open_file(filepath):
        with open(filepath, 'r', encoding='UTF-8') as infile:
            return infile.read()

2. Create the openai-key.txt file in the same manner as in the previous recipe, then create the API client and the get_chat_gpt_response() function to send the prompt to ChatGPT and obtain a response:

    client = OpenAI(api_key=open_file('openai-key.txt'))

    def get_chat_gpt_response(prompt):
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
            max_tokens=600,
            temperature=0.7
        )
        text = response.choices[0].message.content.strip()
        return text
3. Prompt the user for the name of a tool:

    feed = input("ManPageGPT> $ Enter the name of a tool: ")

4. Replace the <<INPUT>> variable in the prompt.txt file with the user's input:

    prompt = open_file("prompt.txt").replace('<<INPUT>>', feed)
5. Create the prompt.txt file with the following text:

    Provide the manual-page output for the following tool. Provide the output exactly as it would appear in an actual Linux terminal and nothing else before or after the manual-page output.

    <<INPUT>>
6. Send the prompt to the get_chat_gpt_response() function and print the result:

    analysis = get_chat_gpt_response(prompt)
    print(analysis)
Here’s an example of how the complete script should look:
from openai import OpenAI

def open_file(filepath):
    with open(filepath, 'r', encoding='UTF-8') as infile:
        return infile.read()

client = OpenAI(api_key=open_file('openai-key.txt'))

def get_chat_gpt_response(prompt):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=600,
        temperature=0.7
    )
    text = response.choices[0].message.content.strip()
    return text

feed = input("ManPageGPT> $ Enter the name of a tool: ")
prompt = open_file("prompt.txt").replace('<<INPUT>>', feed)
analysis = get_chat_gpt_response(prompt)
print(analysis)
In this example, we created a Python script that utilizes a text file as a prompt template. The text file contains a variable called <<INPUT>>
that can be replaced with any content, allowing for dynamic modification of the prompt without the need to change the overall structure. Specifically for this case, we are replacing it with user input:
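The replacement technique itself is plain string manipulation and can be sketched independently of the API. In this minimal sketch, "nmap" is a sample tool name standing in for live user input, and the template string mirrors the contents of prompt.txt:

```python
# A minimal sketch of the placeholder-replacement technique described above.
# The template text mirrors prompt.txt; "nmap" is a sample tool name.
man_template = (
    "Provide the manual-page output for the following tool. "
    "Provide the output exactly as it would appear in an actual Linux "
    "terminal and nothing else before or after the manual-page output.\n\n"
    "<<INPUT>>"
)

tool_name = "nmap"  # would normally come from input()
example_prompt = man_template.replace('<<INPUT>>', tool_name)

# The placeholder is gone and the user's input now ends the prompt
print(example_prompt.endswith("nmap"))
```

Because replace() returns a new string, the template can be reused for any number of inputs without being modified.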
- The OpenAI class is imported from the openai module to access the ChatGPT API.
- The open_file() function is defined to open and read a file. It takes a file path as an argument, opens the file with read access and UTF-8 encoding, reads the content, and then returns the content.
- The API key is set by reading the openai-key.txt file with the open_file() function and passing the value to the OpenAI client.
- The get_chat_gpt_response() function is defined to send a prompt to ChatGPT and return the response. It takes the prompt as an argument, configures the API request with the desired settings, and then sends the request to the ChatGPT API. It extracts the response text, removes leading and trailing whitespace, and returns it.
- The <<INPUT>> variable in the prompt.txt file is replaced with the tool name provided by the user. This is done using Python's string replace() method, which searches for the specified placeholder and replaces it with the desired content. The surrounding prompt text instructs ChatGPT to produce output resembling that of the Linux man command.
- The complete prompt, with the <<INPUT>> placeholder replaced, is sent to the get_chat_gpt_response() function. The function sends the prompt to ChatGPT, retrieves the response, and the script prints the result. This demonstrates how to use a prompt template with a replaceable variable to create customized prompts for different inputs.

This approach is particularly useful in a cybersecurity context as it allows you to create standard prompt templates for different types of analysis or queries and easily modify the input data as needed.
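As an illustration of that idea, the same pattern extends to other analysis tasks. The template text and log line below are invented for this example, not part of the recipe:

```python
# Hypothetical illustration: the same placeholder technique applied to a
# different cybersecurity task (log analysis). Template and log line are
# invented for this example.
log_template = (
    "You are a security analyst. Review the following log entries and "
    "list any indicators of compromise.\n\n<<INPUT>>"
)

log_entries = "Failed password for root from 203.0.113.7 port 22 ssh2"
log_prompt = log_template.replace('<<INPUT>>', log_entries)

# log_prompt could now be passed to get_chat_gpt_response() exactly as the
# man-page prompt was
print("<<INPUT>>" not in log_prompt)
```

Only the template file changes between use cases; the script's logic stays the same.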
Instead of the <<INPUT>> format, you can customize your variable format to better suit your needs or preferences. For example, you can use curly braces (for example, {input}) or any other format that you find more readable and manageable.

You can also modify the open_file() function to read an environment variable instead of a file, ensuring that sensitive data such as your API key is not accidentally leaked or exposed.

By exploring these additional techniques, you can create more powerful, flexible, and secure prompt templates for use with ChatGPT in your cybersecurity projects.
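Both suggestions can be sketched briefly. The first swaps replace() for str.format() with a {input} placeholder; the second reads the key from an environment variable (OPENAI_API_KEY is the variable name the v1 openai SDK checks by default, so OpenAI() with no arguments would pick it up automatically):

```python
import os

# Curly-brace placeholder filled with str.format() instead of replace().
# Note: any literal braces elsewhere in the template would need escaping
# as {{ and }}.
template = "Provide the manual-page output for the following tool: {input}"
prompt_text = template.format(input="grep")

# Reading the API key from an environment variable rather than a file,
# so the key never lives in a file that might be committed or shared
api_key = os.environ.get("OPENAI_API_KEY", "")

print(prompt_text)
```

The environment-variable approach pairs well with the template technique: neither the prompt file nor the script then contains any secrets.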