I'm not sure what point you're trying to make. Here it is in C, so you could run it on your computer in 1995, because servers could make decisions in 1995.
    #include <stdio.h>
    #include <string.h>
    #include <time.h>
    #include <unistd.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(void) {
        int s = socket(AF_INET, SOCK_STREAM, 0);
        setsockopt(s, SOL_SOCKET, SO_REUSEADDR, &(int){1}, sizeof(int));

        struct sockaddr_in addr = {
            .sin_family = AF_INET,
            .sin_port = htons(8080),
            .sin_addr.s_addr = INADDR_ANY,
        };
        bind(s, (struct sockaddr *)&addr, sizeof(addr));
        listen(s, 10);
        printf("Listening on :8080\n");

        while (1) {
            int c = accept(s, NULL, NULL);
            char req[1024] = {0};
            read(c, req, sizeof(req) - 1);

            /* The server-side "decision": return 404 on Tuesdays. */
            time_t now = time(NULL);
            int tuesday = localtime(&now)->tm_wday == 2;
            const char *status = tuesday ? "404 Not Found" : "200 OK";
            const char *body = tuesday ? "Not Found (it's Tuesday)"
                                       : "Hello from 1995!";

            char resp[256];
            snprintf(resp, sizeof(resp),
                     "HTTP/1.1 %s\r\n"
                     "Content-Length: %zu\r\n"
                     "Connection: close\r\n\r\n%s",
                     status, strlen(body), body);
            write(c, resp, strlen(resp));
            close(c);
        }
    }
A post claimed CGI led to bad standards around query parameter formatting and parsing. I was merely pointing out that, prior to the advent of CGI, if you wanted to actually do anything with those parameters on the server, you had to extend whatever primitive HTTP server you were running, write some custom code, and invent your own "standard". There were no server-side frameworks or standards.