Using OpenAI to hack JavaScript

jswank's blog


I know enough JavaScript to be dangerous: I can hack on it, but my code will be sloppy and ill-informed! I've also been experimenting with the OpenAI API. What can go wrong?

Background #

While I was changing the connection settings for a project's S3 image upload functionality, I discovered that the code would not work unless the URL for the region was specified. I'd incorrectly assumed that specifying a region in the service configuration would be sufficient, so I took a closer look at the code.

My plan, in a nutshell:

Notes #

The primary way I've been interfacing with OpenAI is using sigoden/aichat - my examples reflect that.

The Module #

Here is the module I started with; it was yanked from the middle of a codebase.

// S3Uploader.js
const aws = require('aws-sdk')

module.exports = class S3Uploader {
  constructor(urlBase, region, accessKeyId, secretAccessKey, bucket) {
    this.urlBase = urlBase + bucket
    this.region = region
    this.bucket = bucket

    this.s3 = new aws.S3({
      region,
      accessKeyId,
      secretAccessKey
    })
  }

  upload(path, data) {
    return new Promise((resolve, reject) => {
      this.s3.putObject({
        Body: data,
        Bucket: this.bucket,
        Key: path
      }, (err) => {
        if (err) reject(err)
        else resolve(this.urlBase + '/' + path)
      })
    })
  }
}

I know from experience with AWS SDKs in other languages that authentication and connection information can be defined more generally than this, specifically by using AWS-specific environment variables like AWS_REGION and AWS_ACCESS_KEY_ID as part of a credential chain. Depending on the operating environment, supporting this more general configuration is important: for instance, when code needs to use instance profiles or role assumption.
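As a rough illustration of the idea (a toy, not the SDK's actual implementation), a credential chain just tries providers in order and takes the first one that succeeds:

```javascript
// Toy credential chain: try providers in order, return the first result.
// The real SDK chain also covers shared config files, instance profiles,
// and assumed roles; this only sketches the concept.
function fromEnv(env) {
  if (env.AWS_ACCESS_KEY_ID && env.AWS_SECRET_ACCESS_KEY) {
    return {
      accessKeyId: env.AWS_ACCESS_KEY_ID,
      secretAccessKey: env.AWS_SECRET_ACCESS_KEY
    }
  }
  return null
}

function resolveCredentials(providers) {
  for (const provider of providers) {
    const creds = provider()
    if (creds) return creds
  }
  throw new Error('no credentials found')
}

// fakeEnv stands in for process.env; the key values are placeholders.
const fakeEnv = { AWS_ACCESS_KEY_ID: 'AKIAEXAMPLE', AWS_SECRET_ACCESS_KEY: 'secret' }
const creds = resolveCredentials([() => fromEnv(fakeEnv)])
console.log(creds.accessKeyId) // AKIAEXAMPLE
```

The point is that credentials come from the environment at runtime, not from constructor arguments.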

Some thoughts I had on this code included:

  1. Using AWS credentials in this fashion is not required: rely on the SDK to read the AWS_ environment variables directly.
  2. Requiring the region to be set in this fashion is not required: rely on the variables used by the SDK, and set a default.
  3. Constructing URLs for S3 buckets in this fashion (path-style) is deprecated: a virtual-hosted-style approach should be used.
  4. The SDK version in use is v2; moving to v3 should be trivial.
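To make point 3 concrete, here is a sketch of the two URL styles; the bucket, region, and key values below are placeholders:

```javascript
// Deprecated path-style S3 object URL: bucket appears in the path.
function pathStyleUrl(region, bucket, key) {
  return `https://s3.${region}.amazonaws.com/${bucket}/${key}`
}

// Virtual-hosted-style URL: bucket appears in the hostname.
function virtualHostedUrl(region, bucket, key) {
  return `https://${bucket}.s3.${region}.amazonaws.com/${key}`
}

console.log(pathStyleUrl('us-west-2', 'my-bucket', 'test.txt'))
// https://s3.us-west-2.amazonaws.com/my-bucket/test.txt
console.log(virtualHostedUrl('us-west-2', 'my-bucket', 'test.txt'))
// https://my-bucket.s3.us-west-2.amazonaws.com/test.txt
```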

Creating a Test Harness #

I asked for assistance creating test code for the module as it existed.

$ cat S3Uploader.js | aichat -r coder "Write a test harness for this module. " | tee test.js

The response:

const assert = require('assert')
const S3Uploader = require('./S3Uploader')

describe('S3Uploader', () => {
  const urlBase = 'https://s3.amazonaws.com/'
  const region = 'us-west-2'
  const accessKeyId = 'ACCESS_KEY_ID'
  const secretAccessKey = 'SECRET_ACCESS_KEY'
  const bucket = 'my-bucket'
  const uploader = new S3Uploader(urlBase, region, accessKeyId, secretAccessKey, bucket)

  describe('#upload', () => {
    it('should upload a file to S3', async () => {
      const path = 'test.txt'
      const data = 'Hello, world!'
      const url = await uploader.upload(path, data)
      assert.strictEqual(url, `${urlBase}${bucket}/${path}`)
    })
  })
})
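As generated, this test attempts a real putObject call against S3. One way to keep it offline (a sketch, not the only approach) is to stub the client; here the module's promise wrapper is reproduced standalone with a fake putObject:

```javascript
// Fake client with the same callback signature as aws-sdk's putObject,
// so the upload path can run without credentials or network access.
const fakeS3 = {
  putObject(params, callback) {
    callback(null) // simulate a successful upload
  }
}

// Standalone copy of the module's promise wrapper around putObject.
function upload(s3, urlBase, bucket, path, data) {
  return new Promise((resolve, reject) => {
    s3.putObject({ Body: data, Bucket: bucket, Key: path }, (err) => {
      if (err) reject(err)
      else resolve(urlBase + '/' + path)
    })
  })
}

upload(fakeS3, 'https://s3.amazonaws.com/my-bucket', 'my-bucket', 'test.txt', 'hi')
  .then(url => console.log(url)) // https://s3.amazonaws.com/my-bucket/test.txt
```

In the real test you would assign a stub like this to `uploader.s3` before calling `upload`.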

Structure the Directory #

I wanted some help building the directory structure and scaffolding for doing the work.

$ cat test.js | aichat << 'EOP'
  Here is javascript code for testing the S3Uploader module. What should the
  structure of the directory with this code be in order to run the test using
  `npm test`.  Also output an appropriate package.json file.
EOP

The response:

The directory structure should be as follows:

- project/
  - S3Uploader.js
  - test/
    - S3Uploader.test.js
  - package.json

The S3Uploader.js file should contain the implementation of the S3Uploader module, while the S3Uploader.test.js file should contain the test code shown above.

The package.json file should contain the following:

{
  "name": "my-project",
  "version": "1.0.0",
  "description": "My project description",
  "main": "S3Uploader.js",
  "scripts": {
    "test": "mocha"
  },
  "devDependencies": {
    "mocha": "^9.1.3",
    "assert": "^2.0.0"
  }
}

This assumes that mocha and assert are the only dev dependencies needed to run the tests. Adjust the devDependencies section as necessary for your project.

The Interim Result #

This was enough to get me going. I had to create the directory structure and make a couple of small tweaks:

Some Questions Along the Way #

I couldn't remember how variable interpolation works in JavaScript, so I asked:

$ aichat -r coder "how are variables interpolated with strings in javascript?"

The response:

const name = 'John';
const age = 30;
const message = `My name is ${name} and I am ${age} years old.`;
console.log(message);

I had everything I needed, but `npm test` didn't work.

$ aichat -r coder << 'EOP'
I want to run `npm test`.  What steps do I need to take before then?
EOP

The response:

npm install

The Result #

After I had the basic environment configured, I was able to quickly hack my way to a version of the code that addressed each of the four areas I originally identified. The final S3Uploader.js file is:

// S3Uploader.js
const { S3 } = require("@aws-sdk/client-s3")

module.exports = class S3Uploader {
  constructor(bucket) {
    // Prefer the SDK's environment variables; fall back to a default.
    const region = process.env.AWS_REGION || process.env.AWS_DEFAULT_REGION || 'us-east-1'
    this.bucket = bucket
    this.s3 = new S3({ region })
    this.public_url = `https://${bucket}.s3.${region}.amazonaws.com/`
  }

  upload(path, data) {
    return new Promise((resolve, reject) => {
      this.s3.putObject({
        Body: data,
        Bucket: this.bucket,
        Key: path
      }, (err) => {
        if (err) reject(err)
        else resolve(this.public_url + path)
      })
    })
  }
}
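As a quick sanity check, the constructor's region fallback and virtual-hosted URL template can be replicated outside the class; the bucket and region values below are placeholders:

```javascript
// Mirrors the new constructor's region fallback and public_url template,
// so the URL logic can be checked without the SDK installed.
function publicUrlFor(bucket, env) {
  const region = env.AWS_REGION || env.AWS_DEFAULT_REGION || 'us-east-1'
  return `https://${bucket}.s3.${region}.amazonaws.com/`
}

console.log(publicUrlFor('my-bucket', { AWS_REGION: 'eu-central-1' }))
// https://my-bucket.s3.eu-central-1.amazonaws.com/
console.log(publicUrlFor('my-bucket', {}))
// https://my-bucket.s3.us-east-1.amazonaws.com/
```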

References #