fix(web): add robots/sitemap metadata and bypass middleware #507
Conversation
I've added the background/cause, the per-environment robots/sitemap policy, and the post-deployment checkpoints to the PR description.

Once the deployment completes, please just check the status of these two endpoints:
- https://www.solid-connection.com/robots.txt
- https://www.solid-connection.com/sitemap.xml
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 90efd042ce
ℹ️ About Codex in GitHub
Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".
      priority: 1,
    },
    {
      url: `${siteUrl}/mentor`,
Remove auth-gated URLs from sitemap entries
The sitemap now includes /mentor, but this route is still listed in loginNeedPages in middleware.ts and redirects unauthenticated requests to /login. Search crawlers will therefore hit a redirect (or login wall) for a URL advertised as indexable, which can produce sitemap/indexing errors and weakens crawl quality. Only publicly accessible canonical URLs should be emitted here.
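The suggested direction can be sketched as follows. This is a minimal illustration, not the PR's actual `sitemap.ts`: only `/mentor` and `loginNeedPages` are confirmed by the review, while `buildSitemap`, `candidateRoutes`, and the priority values are hypothetical stand-ins (the real file would default-export a `sitemap()` returning `MetadataRoute.Sitemap`).

```typescript
// Local stand-in for Next.js's MetadataRoute.Sitemap entry shape.
type SitemapEntry = { url: string; priority: number };

const siteUrl = "https://www.solid-connection.com";

// Auth-gated routes mirrored from middleware.ts's loginNeedPages
// (only /mentor is confirmed by the review; treat this list as illustrative).
const loginNeedPages = ["/mentor"];

// Candidate canonical routes considered for the sitemap (illustrative).
const candidateRoutes = ["/", "/mentor", "/community"];

// Emit only publicly reachable URLs; auth-gated routes would send
// crawlers into a /login redirect and weaken crawl quality.
export function buildSitemap(): SitemapEntry[] {
  return candidateRoutes
    .filter((route) => !loginNeedPages.includes(route))
    .map((route) => ({
      url: `${siteUrl}${route}`,
      priority: route === "/" ? 1 : 0.7,
    }));
}
```

Filtering against the same list the middleware uses keeps the sitemap and the auth policy from drifting apart when new gated routes are added.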
Force-pushed 90efd04 to b835988
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: b8359888b3
  export const config = {
    matcher: [
-     "/((?!api|_next/static|_next/image|static/chunks|images|assets|favicon.ico|sw.js|viewer|fonts|.*\\.splat).*)",
+     "/((?!api|_next/static|_next/image|static/chunks|images|assets|favicon.ico|robots.txt|sitemap.xml|sw.js|viewer|fonts|.*\\.splat).*)",
Escape literal dots in matcher exclusions
The new matcher exclusions robots.txt and sitemap.xml are interpreted as regex fragments, so . matches any character and the pattern is prefix-based; this now skips middleware for paths like /sitemap.xml.php or /robotsXtxt, not just the two metadata files. In those cases, isProbePath and auth redirect checks are bypassed entirely, which regresses the probe-blocking behavior this middleware currently enforces. Please make these exclusions exact/literal (e.g., escaped dots and bounded matching) so only /robots.txt and /sitemap.xml are exempted.
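The difference can be demonstrated with plain RegExp objects. This is a simplified sketch of the matcher's negative lookahead, not the compiled form Next.js produces (the matcher string goes through its own compilation, and whether `$` anchors are accepted inside a matcher string would need checking); in the actual matcher string the escaped dots would be written as `robots\\.txt` and `sitemap\\.xml`.

```typescript
// Loose: unescaped dots act as wildcards and matching is prefix-based.
const loose = /^\/(?!api|robots.txt|sitemap.xml).*$/;
// Strict: literal dots plus end anchors exempt only the two exact files.
const strict = /^\/(?!api|robots\.txt$|sitemap\.xml$).*$/;

// test() === true means the path still matches, i.e. middleware runs.
console.log(loose.test("/robotsXtxt"));       // false: middleware skipped (bug)
console.log(loose.test("/sitemap.xml.php"));  // false: middleware skipped (bug)
console.log(strict.test("/robotsXtxt"));      // true: middleware runs
console.log(strict.test("/sitemap.xml.php")); // true: middleware runs
console.log(strict.test("/robots.txt"));      // false: exempt, as intended
```

With the strict form, probe-style paths such as `/sitemap.xml.php` fall back into the middleware, so `isProbePath` and the auth redirect checks keep applying to them.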
Background / Cause

- A "Blocked by robots.txt" warning was raised.
- On production (www.solid-connection.com), the following paths returned 404:
  - /robots.txt
  - /sitemap.xml

Changes
- Added apps/web/src/app/robots.ts: serves Allow: / and exposes the Sitemap URL, with Disallow: / per environment
- Added apps/web/src/app/sitemap.ts
- apps/web/src/middleware.ts: added robots.txt and sitemap.xml to the matcher exclusions
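The robots.ts addition might be sketched roughly like this. It is an illustration of the described policy, not the PR's actual file: local types stand in for Next.js's `MetadataRoute.Robots`, and `isProd` is a parameter here for demonstration, whereas the real file would default-export a zero-argument `robots()` and derive the environment from the deployment setup.

```typescript
// Sketch of apps/web/src/app/robots.ts with local stand-in types.
type RobotsRules = { userAgent: string; allow?: string; disallow?: string };
type RobotsFile = { rules: RobotsRules; sitemap?: string };

const siteUrl = "https://www.solid-connection.com";

export function robots(isProd: boolean): RobotsFile {
  if (isProd) {
    // Production: allow all crawling and expose the sitemap URL.
    return {
      rules: { userAgent: "*", allow: "/" },
      sitemap: `${siteUrl}/sitemap.xml`,
    };
  }
  // Non-production environments: block all crawling.
  return { rules: { userAgent: "*", disallow: "/" } };
}
```

Keeping non-production deployments on Disallow: / prevents preview URLs from competing with the canonical domain in search results.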
Expected effect

Verification
- pnpm --filter web exec tsc -p tsconfig.json --noEmit
- lint
- typecheck
- next build

Post-deployment checkpoints
- https://www.solid-connection.com/robots.txt -> 200 + Allow: /
- https://www.solid-connection.com/sitemap.xml -> 200
- Run "Validate Fix"
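The first two checkpoints can be scripted from a terminal. A sketch, assuming the production deployment is live and curl is available; BASE_URL can be overridden to point at a preview deployment instead.

```shell
#!/usr/bin/env sh
# Post-deploy smoke check for the two metadata endpoints.
BASE_URL="${BASE_URL:-https://www.solid-connection.com}"

for path in robots.txt sitemap.xml; do
  # Print only the HTTP status code; expect 200 for both paths.
  code=$(curl -s -o /dev/null -w "%{http_code}" "$BASE_URL/$path")
  echo "$path -> $code"
done

# robots.txt should also advertise the Allow rule.
curl -s "$BASE_URL/robots.txt" | grep -i "^Allow: /" || echo "Allow rule missing"
```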