
Docusaurus-ify (#26051)

This PR cleans up the docs to make them simpler to ingest by
our [docs repo](https://gitea.com/gitea/gitea-docusaurus).

1. It incorporates all of the sed changes our ingestion previously made, removing
the need to apply them at build time.
2. It replaces the shortcode variable replacement method with the
`@variable@` style, simply for easier sed invocations when required (a sketch follows this list).
3. It removes unused files and moves the docs up a level as cleanup.
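
As a rough illustration of point 2, here is a minimal sketch of the kind of sed invocation the `@variable@` style makes easy; the `@version@` placeholder, the docs path, and the substituted value are hypothetical examples, not taken from this PR:

```sh
# Hypothetical: replace every @version@ placeholder in the docs tree with a concrete value.
# The placeholder name, path, and value are illustrative only.
find docs -name '*.md' -exec sed -i 's/@version@/1.20.0/g' {} +
```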

---------

Signed-off-by: jolheiser <john.olheiser@gmail.com>
Author: John Olheiser
Date: 2023-07-25 23:53:13 -05:00 (committed by GitHub)
Parent: 5dc37ef97a
Commit: bd4c7ce578
281 changed files with 794 additions and 2157 deletions


@@ -0,0 +1,39 @@
---
date: "2023-05-23T09:00:00+08:00"
title: "Search Engines Indexation"
slug: "search-engines-indexation"
sidebar_position: 60
toc: false
draft: false
aliases:
- /zh-cn/search-engines-indexation
menu:
  sidebar:
    parent: "administration"
    name: "Search Engines Indexation"
    sidebar_position: 60
    identifier: "search-engines-indexation"
---

# Search Engines Indexation of a Gitea Installation

By default, your Gitea installation will be indexed by search engines.
If you don't want your repositories to be visible to search engines, read further.

## Block search engines indexation using robots.txt

To make Gitea serve a custom `robots.txt` (default: an empty 404) for top level installations, create a file called `robots.txt` in the [`custom` folder or `CustomPath`](administration/customizing-gitea.md).

Examples on how to configure the `robots.txt` can be found at [https://moz.com/learn/seo/robotstxt](https://moz.com/learn/seo/robotstxt).

```txt
User-agent: *
Disallow: /
```
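
For instance, a minimal sketch of placing the file, assuming a `custom` directory at `/var/lib/gitea/custom` (the actual location depends on your installation's `CustomPath`):

```sh
# The custom path below is a placeholder; use your own CustomPath.
mkdir -p /var/lib/gitea/custom
cp robots.txt /var/lib/gitea/custom/robots.txt
```
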
If you installed Gitea in a subdirectory, you will need to create or edit the `robots.txt` in the top level directory:
```txt
User-agent: *
Disallow: /gitea/
```
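
To verify what is actually being served, you can request the file directly; a minimal sketch assuming your instance runs at `https://git.example.com` (replace with your own domain):

```sh
# The domain is a placeholder for your own Gitea instance.
curl https://git.example.com/robots.txt
```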