lynx · Commit ea87652c
Authored Sep 10, 2024 by Walter Huang

init: project
Showing 3 changed files with 113 additions and 0 deletions.
README.md          +8   -0
http.ts            +51  -0
requestOpenAI.ts   +54  -0
README.md · new file (mode 100644)
# Bun Starter

Quickly get started with [Bun](https://bun.sh/) using this starter! Bun is a fast all-in-one JavaScript runtime with a focus on speed.

- This starter starts a Bun HTTP server on [localhost:3000](http://localhost:3000).
- You can check [http.ts](./http.ts) to see how the server is started.
- If you want to upgrade Bun, you can change `BUN_VERSION` in the [Dockerfile](./.devcontainer/Dockerfile).
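The README's quickest sanity check is that the server answers on port 3000. Below is a minimal smoke-test sketch, assuming http.ts has already been started (for example with `bun http.ts`) and is listening on localhost:3000; the file name is hypothetical and not part of this commit.

// check-server.ts (hypothetical, not part of this commit)
// Assumes the server from http.ts is already running on localhost:3000.
const res = await fetch("http://localhost:3000/");
console.log(res.status, await res.text()); // expected: 200 "Hello from Bun server!"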
http.ts · new file (mode 100644)
import { generateChatCompletions } from "./requestOpenAI";

Bun.serve({
  port: 3000,
  async fetch(req) {
    const url = new URL(req.url);

    // Handle OPTIONS preflight requests
    if (req.method === "OPTIONS") {
      return new Response(null, {
        status: 204, // No Content
        headers: {
          "Access-Control-Allow-Origin": "*",
          "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
          "Access-Control-Allow-Headers": "Content-Type",
          "Access-Control-Max-Age": "86400",
        },
      });
    }

    // Handle the actual request
    if (url.pathname === "/api/data" && req.method === "POST") {
      try {
        const body = await req.json();
        const { prompt, executeTimes, params } = body;
        const data = await generateChatCompletions(prompt, executeTimes, params);
        return new Response(JSON.stringify({ data }), {
          headers: {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": "*",
          },
        });
      } catch (error) {
        console.error("Error generating chat completions:", error);
        return new Response("Internal Server Error", { status: 500 });
      }
    }

    // Fallback for all other routes
    return new Response("Hello from Bun server!", {
      headers: {
        "Access-Control-Allow-Origin": "*",
      },
    });
  },
});
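For context, a hedged client sketch for the /api/data route defined above; the request fields mirror what the handler destructures (prompt, executeTimes, params), the URL assumes the default port 3000, and the file name is illustrative only.

// client.ts (illustrative, not part of this commit)
const res = await fetch("http://localhost:3000/api/data", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    prompt: "Say hello",           // forwarded to generateChatCompletions
    executeTimes: 2,               // how many completions to request in parallel
    params: { temperature: 0.2 },  // merged over defaultConfig in requestOpenAI.ts
  }),
});
const { data } = await res.json(); // data: an array of completion strings
console.log(data);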
requestOpenAI.ts · new file (mode 100644)
export async function generateChatCompletions(
  prompt: string,
  executeTimes: number,
  params: Record<string, unknown>
) {
  const headers = {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.API_KEY}`, // Replace with your OpenAI API key
  };
  const endpoint = "https://api.openai.com/v1/chat/completions?api-version=2024-02-01";
  const defaultConfig = {
    model: "gpt-3.5-turbo-0613",
    max_tokens: 1200,
    temperature: 0.9,
    top_p: 1.0,
    n: 1,
  };

  function fetchWithTimeout(url: string, options: RequestInit = {}, timeout = 60) {
    // Create a timer Promise to simulate a timeout
    const timeoutPromise = new Promise<never>((_, reject) =>
      setTimeout(() => reject(new Error("Request timed out")), timeout * 1000)
    );
    // Use Promise.race to run fetch and the timeout in parallel
    return Promise.race([fetch(url, options), timeoutPromise]);
  }

  // Send a single chat-completion request and return its text content
  const sendRequest = async () => {
    const payload = {
      messages: [{ role: "user", content: prompt }],
      ...defaultConfig,
      ...params,
    };
    try {
      const response = await fetchWithTimeout(endpoint, {
        method: "POST",
        headers,
        body: JSON.stringify(payload),
      });
      const data = await response.json();
      return data.choices[0]?.message?.content || ""; // Handle possible errors
    } catch (error) {
      console.error("Error fetching completion:", error);
      return ""; // Handle network errors or other exceptions
    }
  };

  // Fire executeTimes requests in parallel and collect the results
  const requests = Array.from({ length: executeTimes }, sendRequest);
  const newValues = await Promise.all(requests);
  return newValues;
}
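And a short sketch of calling generateChatCompletions directly, assuming API_KEY is set in the environment as the code above expects; the file name is hypothetical.

// example.ts (illustrative, not part of this commit)
import { generateChatCompletions } from "./requestOpenAI";

// Runs two requests in parallel; requests that fail or time out come back as "".
const completions = await generateChatCompletions(
  "Summarize the Bun runtime in one sentence.",
  2,
  { temperature: 0.3 }
);
console.log(completions);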