Hey, it's finally happened. I've decided to write a blog post. And if you're reading this, I've also finished one. I have wanted to do this for a long time, but could never find the motivation to start. But you know what they say: anger is the best motivator. They do say that, right?
Some context that's in the background
We're going on a journey, you and I. But first, we need to set the scene. Imagine we're working for $COMPANY and one of our Next.js services did an oopsie. This being Next.js, we of course have no idea what actually happened since the default logging is only enabled during development.
Our quest is to go in and setup some production ready logging. It's not going to be easy, but then again, nothing ever is.
Middleware? Middle of nowhere!
The first step of our journey is the middleware. The documentation even states this:
Middleware executes before routes are rendered. It's particularly useful for implementing custom server-side logic like authentication, logging, or handling redirects.
Alright, looks simple enough. Time to pick a logging library. I went with pino since we have used it before. Anything is an upgrade over console.log anyways. We'll get this done before lunch.
Let's set up a basic middleware:
// middleware.ts
import { NextResponse, NextRequest } from "next/server";

export async function middleware(request: NextRequest) {
  return NextResponse.next({
    request: request,
    headers: request.headers,
    // status: 200,
    // statusText: 'OK'
  });
}

export const config = {
  matcher: "/:path*",
};
I think we already have a problem here. You can pass a grand total of 4 parameters from your middleware, and the only thing that actually affects the invoked route is the headers. Let's not skip over the fact that you can't have multiple middlewares or chain them either. How do you fuck this up so bad? We've had middlewares since at least the early 2010s when Express came out.
Anyways, we're smart and modern Node.js has some pretty nifty tools. Let's reach for AsyncLocalStorage.
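If you haven't run into it before: AsyncLocalStorage lets you stash a value for the duration of an async call chain and read it back from anywhere downstream, without threading it through every function signature. A minimal standalone sketch (plain Node, no Next.js involved; the names are made up for illustration):

```typescript
import { AsyncLocalStorage } from "async_hooks";

// A per-request context object for the current async call chain.
const storage = new AsyncLocalStorage<{ requestId: string }>();

function deepInTheCallStack(): string {
  // No parameters needed: read whatever context was set upstream.
  return storage.getStore()?.requestId ?? "no-context";
}

async function handleRequest(requestId: string): Promise<string> {
  // Everything (sync or async) called inside run() sees this store,
  // even across await boundaries.
  return storage.run({ requestId }, async () => {
    await Promise.resolve(); // cross an async boundary on purpose
    return deepInTheCallStack();
  });
}
```

With that mental model, here's the actual setup: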
// app/logger.ts
import { AsyncLocalStorage } from "async_hooks";
import { Logger, pino } from "pino";

const loggerInstance = pino({
  // Whatever config we need
  level: process.env.LOG_LEVEL ?? "trace",
});

export const LoggerStorage = new AsyncLocalStorage<Logger>();

export function logger(): Logger | null {
  return LoggerStorage.getStore() ?? null;
}

export function requestLogger(): Logger {
  return loggerInstance.child({ requestId: crypto.randomUUID() });
}
// middleware.ts
import { NextResponse, NextRequest } from "next/server";
import { logger, LoggerStorage, requestLogger } from "./app/logger";

export async function middleware(request: NextRequest) {
  LoggerStorage.enterWith(requestLogger());
  logger()?.debug({ url: request.url }, "Started processing request!");
  return NextResponse.next();
}
Whew, hard work done. Let's test it out. Visit localhost:3000 and we see this:
{ requestId: 'ec7718fa-b1a2-473e-b2e2-8f51188efa8f' } { url: 'http://localhost:3000/' } 'Started processing request!'
GET / 200 in 71ms
{ requestId: '09b526b1-68f4-4e90-971f-b0bc52ad167c' } { url: 'http://localhost:3000/next.svg' } 'Started processing request!'
{ requestId: '481dd2ff-e900-4985-ae15-0b0a1eb5923f' } { url: 'http://localhost:3000/vercel.svg' } 'Started processing request!'
{ requestId: 'e7b29301-171c-4c91-af25-771471502ee4' } { url: 'http://localhost:3000/file.svg' } 'Started processing request!'
{ requestId: '13766de3-dd00-42ce-808a-ac072dcfd4c6' } { url: 'http://localhost:3000/window.svg' } 'Started processing request!'
{ requestId: '317e054c-1a9a-4dd8-ba21-4c0201fbeada' } { url: 'http://localhost:3000/globe.svg' } 'Started processing request!'
I don't know if you've ever used pino before, but this is wrong. Can you figure out why?
Unlike Next.js, I won't keep you waiting in limbo. This is the browser output. Why? Well, it's because the default Next.js middleware runtime is edge. We can of course switch to the nodejs runtime, which should work. Except, of course, it might not.
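For reference, opting the middleware into the Node.js runtime looks roughly like this. On the versions where I tried it, this was gated behind an experimental flag; the exact flag name and whether it's needed at all depends on your Next.js version, so treat this as a sketch and check the docs for your release:

```typescript
// next.config.ts
// Hypothetical: on some Next.js 15.x releases, Node.js middleware
// was experimental and had to be enabled here first.
const nextConfig = {
  experimental: {
    nodeMiddleware: true,
  },
};

export default nextConfig;

// middleware.ts — then opt the middleware itself into the runtime
export const config = {
  matcher: "/:path*",
  runtime: "nodejs",
};
```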
I've tried it on a fresh Next.js project and it does work. But it didn't when I was trying it out on our actual project. I swear I'm not crazy. Anyways, this isn't the main issue. We're slowly getting there.
Paging the local mental asylum
Logging in the middleware is cool and all, but it's not where the bulk of the magic happens. For that, we need to log in pages and layouts. Let's try it out.
// app/page.tsx
import { logger } from "./logger";

export default function Home() {
  logger()?.info("Logging from the page!");
  return <div>Real simple website!</div>;
}
Refresh the page and we get this:
✓ Compiled / in 16ms
GET / 200 in 142ms
That's it? That's it. Nothing. Nada. Zilch.
For posterity's sake, this is what it's supposed to look like:
✓ Compiled / in 2.2s
[11:38:59.259] INFO (12599): Logging from the page!
requestId: "2ddef9cf-6fee-4d1d-8b1e-6bb16a3e636b"
GET / 200 in 2520ms
Ok, this is getting a bit long, so I'll get to the point. The logger function returns null. Why? I'm not entirely sure, but it seems like rendering is not executed in the same async context as the middleware.
What's the solution then? You're not going to believe this. Remember how the only value you can pass from the middleware is headers? Yeah. That's what we need to use.
The following is for people with strong stomachs:
// app/log/serverLogger.ts
import { pino } from "pino";

export const loggerInstance = pino({
  // Whatever config we need
  level: process.env.LOG_LEVEL ?? "info",
});

// app/log/middleware.ts
// Yes, we need to split up the loggers ...
// Mostly the same as before
import { Logger } from "pino";
import { loggerInstance } from "./serverLogger";

export function requestLogger(requestId: string): Logger {
  return loggerInstance.child({ requestId });
}

// app/log/server.ts
import { headers } from "next/headers";
import { loggerInstance } from "./serverLogger";
import { Logger } from "pino";
import { NextRequest } from "next/server";

const REQUEST_ID_HEADER = "dominik-request-id";

export function requestHeaders(
  request: NextRequest,
  requestId: string,
): Headers {
  const head = new Headers(request.headers);
  head.set(REQUEST_ID_HEADER, requestId);
  return head;
}

// Yeah, this has to be async ...
export async function logger(): Promise<Logger> {
  const hdrs = await headers();
  const requestId = hdrs.get(REQUEST_ID_HEADER);
  return loggerInstance.child({ requestId });
}

// middleware.ts
import { NextResponse, NextRequest } from "next/server";
import { logger, LoggerStorage, requestLogger } from "./app/log/middleware";
import { requestHeaders } from "./app/log/server";

export async function middleware(request: NextRequest) {
  const requestId = crypto.randomUUID();
  LoggerStorage.enterWith(requestLogger(requestId));
  logger()?.debug({ url: request.url }, "Started processing request!");
  return NextResponse.next({ headers: requestHeaders(request, requestId) });
}

// app/page.tsx
import { logger } from "./log/server";

export default async function Home() {
  (await logger())?.info("Logging from the page!");
  // ...
}
Isn't it beautiful? I especially appreciate how it's now possible to import the middleware logging code from the server. Which of course won't work. Or import the server logging code from the middleware. Which also won't work. Better not mess up. Also, we haven't even touched upon logging in client components, which despite the name also run on the server. Yeah, that's a third split.
Congratulations, you're being coddled. Please do not resist.
Listen. I wanted to apologize, because I've led you into this trap. You see, I have already fallen into it several times before. A middleware system can be pretty useful when designed correctly and I wanted you to see what it looks like when it's not. The reason for writing this blog post actually started here.
I think every one of us has reached a point in their lives where they've had enough. For me, it was right here. Fuck it, let's use a custom server.
A custom Next.js server allows you to programmatically start a server for custom patterns. The majority of the time, you will not need this approach. However, it's available if you need to eject.
Let's take a look at the example from the documentation:
import { createServer } from 'http'
import { parse } from 'url'
import next from 'next'

const port = parseInt(process.env.PORT || '3000', 10)
const dev = process.env.NODE_ENV !== 'production'
const app = next({ dev })
const handle = app.getRequestHandler()

app.prepare().then(() => {
  createServer((req, res) => {
    const parsedUrl = parse(req.url!, true)
    handle(req, res, parsedUrl)
  }).listen(port)

  console.log(
    `> Server listening at http://localhost:${port} as ${
      dev ? 'development' : process.env.NODE_ENV
    }`
  )
})
Note that once again, handle doesn't really take any parameters. Only the request URL and the raw request and response.
Anyways, we still have AsyncLocalStorage, so this doesn't concern us too much. Let's modify the example a bit.
// app/logger.ts
// Reverted back to our AsyncLocalStorage variation
import { pino, Logger } from "pino";
import { AsyncLocalStorage } from "async_hooks";

const loggerInstance = pino({
  // Whatever config we need
  level: process.env.LOG_LEVEL ?? "info",
});

export const LoggerStorage = new AsyncLocalStorage<Logger>();

export function logger(): Logger | null {
  return LoggerStorage.getStore() ?? null;
}

export function requestLogger(): Logger {
  return loggerInstance.child({ requestId: crypto.randomUUID() });
}

// server.ts
// Same setup as the docs example above
import { logger, LoggerStorage, requestLogger } from "./app/logger";

app.prepare().then(() => {
  createServer(async (req, res) => {
    // This is new
    LoggerStorage.enterWith(requestLogger());
    logger()?.info({}, "Logging from server!");

    const parsedUrl = parse(req.url!, true);
    await handle(req, res, parsedUrl);
  }).listen(port);
});

// middleware.ts
import { NextResponse, NextRequest } from "next/server";
import { logger } from "./app/logger";

export async function middleware(request: NextRequest) {
  logger()?.info({}, "Logging from middleware!");
  return NextResponse.next();
}

// app/page.tsx
import { logger } from "./logger";

export default async function Home() {
  logger()?.info("Logging from the page!");
  // ...
}
Ok, let's test it out. Refresh the browser and ...
> Server listening at http://localhost:3000 as development
[12:29:52.183] INFO (19938): Logging from server!
requestId: "2ffab9a2-7e15-4188-8959-a7822592108f"
✓ Compiled /middleware in 388ms (151 modules)
○ Compiling / ...
✓ Compiled / in 676ms (769 modules)
That's it. Are you fucking kidding me right now? What the fuck?
Now, you might be thinking that this is just not how AsyncLocalStorage works. And you might be right. But I would like to point out that headers() and cookies() use AsyncLocalStorage. This is a power that the Next.js devs have that we don't.
As far as I can tell, there are only two ways to pass information from a middleware to a page:
1. Headers
2. NextResponse.redirect/NextResponse.rewrite to a route with extra params (eg. /[requestId]/page.tsx)
As you might have noticed, neither of these is very pleasant to use in this case.
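For completeness, the rewrite flavour would look roughly like this. It's a hypothetical sketch (the rewriteTarget helper is mine, not a Next.js API), and it assumes you restructure every route under app/[requestId]/, which is exactly why it's unpleasant:

```typescript
// Hypothetical helper: build the internal URL that prefixes the
// requested path with a request id segment.
export function rewriteTarget(
  requestId: string,
  pathname: string,
  base: string,
): URL {
  return new URL(`/${requestId}${pathname}`, base);
}

// middleware.ts usage sketch (assumes app/[requestId]/page.tsx exists):
//
// export function middleware(request: NextRequest) {
//   const target = rewriteTarget(
//     crypto.randomUUID(),
//     request.nextUrl.pathname,
//     request.url,
//   );
//   // The browser URL stays unchanged; the page reads params.requestId.
//   return NextResponse.rewrite(target);
// }
```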
You are being coddled. The Next.js devs have a vision and it's either their way or the highway. Note that if it were just the middleware, I wouldn't be sitting here, wasting away my weekend, ranting about a React framework. Believe it or not, I've got better things to do. It's the constant pain you encounter daily when working with Next.js.
Vercel can do better
What's infuriating about this example is that Vercel can very much do better. I don't want to sing too many praises at Svelte(Kit) because I have some misgivings about its recent direction, but it's so much better than Next.js. Let's look at their middleware docs:
handle - This function runs every time the SvelteKit server receives a request [...] This allows you to modify response headers or bodies, or bypass SvelteKit entirely (for implementing routes programmatically, for example).
Looking good so far.
locals - To add custom data to the request, which is passed to handlers in +server.js and server load functions, populate the event.locals object, as shown below.
I'm crying tears of joy right now. You can also stuff real objects/classes in there. Like a logger for instance.
You can define multiple handle functions and execute them with sequence.
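Put together, a sketch of what that looks like (assuming your app.d.ts declares App.Locals with a logger field, which SvelteKit lets you do):

```typescript
// src/hooks.server.ts — sketch; assumes app.d.ts declares
// App.Locals with a `logger: Logger` field.
import { sequence } from "@sveltejs/kit/hooks";
import type { Handle } from "@sveltejs/kit";
import { pino } from "pino";

const loggerInstance = pino({ level: process.env.LOG_LEVEL ?? "info" });

const withLogger: Handle = async ({ event, resolve }) => {
  // A real object on a real per-request context. No header smuggling.
  event.locals.logger = loggerInstance.child({
    requestId: crypto.randomUUID(),
  });
  event.locals.logger.debug({ url: event.url.href }, "Started processing!");
  return resolve(event);
};

const timing: Handle = async ({ event, resolve }) => {
  const start = Date.now();
  const response = await resolve(event);
  event.locals.logger.info({ ms: Date.now() - start }, "Finished!");
  return response;
};

// Multiple middlewares, chained. Imagine that.
export const handle = sequence(withLogger, timing);
```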
This is what real engineering looks like. SvelteKit is a Vercel product. How is the flagship offering worse than what is essentially a side project? What the hell?
Scientists discover a new super massive black hole at https://github.com/vercel/next.js/issues
I don't have anything else to add, but while I'm here I feel like I have to talk about the GitHub issue tracker. This is perhaps the crown jewel of the dumpster fire that is Next.js. It's a place where hopes and issues come to die. The mean response time for a bug report is never. I've made it a sport to search the issue tracker/discussion for problems I'm currently facing and bet on how many years it takes to even get a response from a Next.js dev.
You think I'm joking? There are hundreds of issues with as many 👍 emojis with no official response for years. And when you finally get a response, it's to tell you that what you're doing is wrong and a solution to your real problems is on the way. Then they proceed to keep the "solution" in canary for years on end.
I personally reported two issues a year ago. Keep in mind that to have a valid bug report, you need a reproduction.
So, what do you get for taking the time to make a minimal reproduction? That's right. Complete silence.
I would have reported about a dozen other issues I have encountered, but after this experience I gave up.
Honestly, I don't even know if the issues are still valid.
Have we learned anything?
I don't know. For me, personally, I don't want to use Next.js anymore. You might think that this is just a singular issue and I'm overreacting. But there are bugs and edge cases around every corner. How did they manage to make TypeScript compile slower than Rust? Why make a distinction between code running on the client and server and then not give me any tools to take advantage of that? Why? Why? Why?
I don't think I quite have enough pull to move us out of Next.js land. But, I think I will voice my opinion if we end up writing another app. We'll see if the grass is any greener on the other side.