Merge pull request #1 from PLG-Works/setup
- Generate static HTML files
- Find and Replace custom text in generated files
- AWS S3 upload
bala007 authored Aug 19, 2022
2 parents 79fae29 + 03525ff commit 1c35534
Showing 6 changed files with 295 additions and 1 deletion.
13 changes: 13 additions & 0 deletions .env.sample
@@ -0,0 +1,13 @@
GHOST_HOSTED_URL=https://content.yoursite.com/blog
GHOST_STATIC_HOST_URL=https://yoursite.com/blog
S3_BUCKET_NAME=s3-bucket-name
AWS_ACCESS_KEY_ID=access-key-id
AWS_SECRET_ACCESS_KEY=secret-access-key
AWS_DEFAULT_REGION=us-east-1

# CUSTOM_REPLACE_KEYS and CUSTOM_REPLACE_VALUES must be comma separated values, and the number of elements must be the same in both.
CUSTOM_REPLACE_KEYS="key_1", "key_2"
CUSTOM_REPLACE_VALUES="value_1", "value_2"

# To replace ld+json data for the root index file use following variable
ROOT_INDEX_JSONLD={"@context":"http://schema.org","@type":"Organization"}
11 changes: 11 additions & 0 deletions Dockerfile
@@ -0,0 +1,11 @@
FROM bash:5.1.16-alpine3.15

RUN apk add --no-cache wget aws-cli

ENV GHOST_STATIC_CONTENT_DIR=/src/content

COPY entrypoint.sh /entrypoint.sh
RUN mkdir -p ${GHOST_STATIC_CONTENT_DIR}
COPY run.sh /src/

ENTRYPOINT ["/entrypoint.sh"]
75 changes: 74 additions & 1 deletion README.md
@@ -1 +1,74 @@
# ghost-static-site-generator
# Ghost Static Site Generator

Generate static HTML files for a custom Ghost hosting setup and publish them to AWS S3 as a static website.
To host the static blog site under a path such as `/blog` (`https://yourdomain.com/blog`), the Ghost instance needs to be hosted under the same path, e.g. `https://content.yourdomain.com/blog`.

If your blog is hosted at `https://content.yourdomain.com` and you want to host the static site at `https://yourdomain.com/blog`, this is not possible. The same applies the other way around.

You can also replace specific text in the generated static files by passing the `custom_replace_keys` and `custom_replace_values` arguments. For more details, refer to the Inputs and Example usage sections. Multiline replacement is not supported at the moment.

Additionally, the action can upload the static HTML files to an S3 bucket. To access these files publicly, make the bucket public and enable static website hosting.
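
As a rough sketch, static website hosting can be enabled on the bucket with the AWS CLI; the bucket name and error document below are placeholders, not something this action configures for you:

```bash
# Placeholder bucket name; the 404 page generated by this action is uploaded to /404/index.html
aws s3 website s3://your-s3-bucket-name \
  --index-document index.html \
  --error-document 404/index.html
```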

## Inputs

## `ghost_hosted_url`

**Required** Ghost hosted URL endpoint. (`ex: https://content.yourdomain.com/blog`)

## `ghost_static_host_url`

**Required** URL endpoint where the static files need to be hosted. (`ex: https://yourdomain.com/blog`)

## `custom_replace_keys`

**Optional** Comma separated list of items to be replaced with the items in custom_replace_values at the same index.

## `custom_replace_values`

**Optional** Comma separated replacement values for the items in custom_replace_keys.

## `root_index_jsonld`

**Optional** ld+json data for the root index file.

## `s3_bucket_name`

**Optional** S3 bucket name to upload static HTML files.

## `aws_access_key_id`

**Optional** AWS access key Id.

## `aws_secret_access_key`

**Optional** AWS secret access key.

## `aws_region`

**Optional** AWS region.

## Example usage

```yaml
name: Generate Static HTML files
uses: PLG-Works/ghost-static-site-generator@v1
with:
  ghost_hosted_url: "https://content.yourdomain.com/blog"
  ghost_static_host_url: "https://yourdomain.com/blog"
  s3_bucket_name: "your-s3-bucket-name"
  aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }} # Accessed from GitHub secrets
  aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }} # Accessed from GitHub secrets
  aws_region: "us-east-1"
  custom_replace_keys: "key_1, key_2, key_n"
  custom_replace_values: "value_1, value_2, value_n"
```
### _Locally build and run with docker_
```bash
docker build -t ghost-ssg .
docker run -it --env-file .env.sample ghost-ssg
```

> Make appropriate changes to the `.env.sample` file.
> To persist the generated HTML files in a local (host system) directory, use the bind mount option with the `docker run` command, for example: `-v /path/to/local/dir:/src/content`
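
A complete command with a bind mount might look like this (the host directory name is just an example):

```bash
docker run -it --rm --env-file .env.sample \
  -v "$(pwd)/static-output:/src/content" \
  ghost-ssg
```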
43 changes: 43 additions & 0 deletions action.yml
@@ -0,0 +1,43 @@
name: "Ghost Static Site Generator"
description: "Generate static HTML files for the custom ghost hosting and publishing them on S3 static website bucket"
inputs:
ghost_hosted_url:
description: "Ghost hosted URL endpoint"
required: true
ghost_static_host_url:
description: "URL endpoint where static files needs to be hosted"
required: true
custom_replace_keys:
description: "Comma separated list of items that needs to be replaced from the items in custom_replace_values at the same index"
required: false
custom_replace_values:
description: "Comma separated associated values for the item in custom_replace_keys"
required: false
root_index_jsonld:
description: "ld+json data for the root index file"
required: false
s3_bucket_name:
description: "S3 bucket name to upload static HTML files"
required: false
aws_access_key_id:
description: "AWS access key Id"
required: false
aws_secret_access_key:
description: "AWS secret access key"
required: false
aws_region:
description: "AWS region"
required: false
runs:
using: "docker"
image: "Dockerfile"
args:
- ${{ inputs.ghost_hosted_url }}
- ${{ inputs.ghost_static_host_url }}
- ${{ inputs.custom_replace_keys }}
- ${{ inputs.custom_replace_values }}
- ${{ inputs.root_index_jsonld }}
- ${{ inputs.s3_bucket_name }}
- ${{ inputs.aws_access_key_id }}
- ${{ inputs.aws_secret_access_key }}
- ${{ inputs.aws_region }}
15 changes: 15 additions & 0 deletions entrypoint.sh
@@ -0,0 +1,15 @@
#!/usr/bin/env bash
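
# When run as a GitHub Action, the runner passes every `with:` input as an
# INPUT_<UPPERCASED_NAME> environment variable; map those onto the names run.sh expects.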

if [[ $# -gt 0 && "${GITHUB_ACTIONS}" == "true" ]]; then
  export GHOST_HOSTED_URL=${INPUT_GHOST_HOSTED_URL}
  export GHOST_STATIC_HOST_URL=${INPUT_GHOST_STATIC_HOST_URL}
  export CUSTOM_REPLACE_KEYS=${INPUT_CUSTOM_REPLACE_KEYS}
  export CUSTOM_REPLACE_VALUES=${INPUT_CUSTOM_REPLACE_VALUES}
  export ROOT_INDEX_JSONLD=${INPUT_ROOT_INDEX_JSONLD}
  export S3_BUCKET_NAME=${INPUT_S3_BUCKET_NAME}
  export AWS_ACCESS_KEY_ID=${INPUT_AWS_ACCESS_KEY_ID}
  export AWS_SECRET_ACCESS_KEY=${INPUT_AWS_SECRET_ACCESS_KEY}
  export AWS_DEFAULT_REGION=${INPUT_AWS_REGION}
fi

/src/run.sh
139 changes: 139 additions & 0 deletions run.sh
@@ -0,0 +1,139 @@
#!/usr/bin/env bash

# The following ENV variables are required.
# - GHOST_STATIC_CONTENT_DIR (set in Dockerfile)
# - GHOST_HOSTED_URL (something like "https://content.yoursite.com/blog")
# - GHOST_STATIC_HOST_URL (something like "https://yoursite.com/blog")
# Optional ENV variables:
# - CUSTOM_REPLACE_KEYS (comma separated list of items to be replaced with the items in CUSTOM_REPLACE_VALUES at the same index)
# - CUSTOM_REPLACE_VALUES (replacement values for the items in CUSTOM_REPLACE_KEYS)
# - ROOT_INDEX_JSONLD (LD+JSON data that needs to be replaced in root index.html page)
# - S3_BUCKET_NAME (S3 bucket name to upload static HTML files)
# - AWS_ACCESS_KEY_ID
# - AWS_SECRET_ACCESS_KEY
# - AWS_DEFAULT_REGION

GHOST_HOSTED_DOMAIN_WITH_PATH=$(echo ${GHOST_HOSTED_URL} | cut -d '/' -f 3-)
GHOST_HOSTED_BLOG_PATH=$(echo ${GHOST_HOSTED_URL} | cut -d '/' -f 4-)
GHOST_STATIC_BLOG_DOMAIN=$(echo ${GHOST_STATIC_HOST_URL} | cut -d '/' -f 3- | cut -d '/' -f 1)
GHOST_STATIC_HOST=$(echo ${GHOST_STATIC_HOST_URL} | cut -d '/' -f 1-3)
GHOST_STATIC_BLOG_PATH=$(echo ${GHOST_STATIC_HOST_URL} | cut -d '/' -f 4-)
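# With the sample URLs above (GHOST_HOSTED_URL=https://content.yoursite.com/blog,
# GHOST_STATIC_HOST_URL=https://yoursite.com/blog) these resolve to:
#   GHOST_HOSTED_DOMAIN_WITH_PATH=content.yoursite.com/blog
#   GHOST_HOSTED_BLOG_PATH=blog
#   GHOST_STATIC_BLOG_DOMAIN=yoursite.com
#   GHOST_STATIC_HOST=https://yoursite.com
#   GHOST_STATIC_BLOG_PATH=blog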

if [[ -z "${GHOST_HOSTED_URL}" ]]; then
echo "Error: GHOST_HOSTED_URL is mandatory"
exit 1
fi

if [[ -z "${GHOST_STATIC_HOST_URL}" ]]; then
echo "Error: GHOST_STATIC_HOST_URL is mandatory"
exit 1
fi

if [[ "${GHOST_HOSTED_BLOG_PATH}" != "${GHOST_STATIC_BLOG_PATH}" ]]; then
echo "Error: Path mismatch. The ${GHOST_HOSTED_URL} and ${GHOST_STATIC_HOST_URL} should end with same path suffix."
exit 1
fi

echo "###########################################################"
echo "GHOST_HOSTED_URL : ${GHOST_HOSTED_URL}"
echo "GHOST_HOSTED_BLOG_PATH : ${GHOST_HOSTED_BLOG_PATH}"
echo "GHOST_STATIC_HOST_URL : ${GHOST_STATIC_HOST_URL}"
echo "GHOST_STATIC_BLOG_DOMAIN : ${GHOST_STATIC_BLOG_DOMAIN}"
echo "GHOST_STATIC_HOST : ${GHOST_STATIC_HOST}"
echo "GHOST_STATIC_BLOG_PATH : ${GHOST_STATIC_BLOG_PATH}"
echo "###########################################################"

blog_dir="${GHOST_STATIC_CONTENT_DIR}"
s3_blog_path="s3://${S3_BUCKET_NAME}"
if [[ ! -z "${GHOST_STATIC_BLOG_PATH}" ]]; then
blog_dir="${blog_dir}/${GHOST_STATIC_BLOG_PATH}"
s3_blog_path="${s3_blog_path}/${GHOST_STATIC_BLOG_PATH}"
fi

if [[ -d ${blog_dir} ]]; then
rm -rf ${blog_dir}/*
fi

echo " "
echo "***** Started fetching static HTML files *****"
WGET_PATHS=("/" "/sitemap.xml" "/sitemap.xsl" "/sitemap-authors.xml" "/sitemap-pages.xml" "/sitemap-posts.xml" "/sitemap-tags.xml" "/404/" "/public/ghost.css" "/public/ghost.min.css" "/public/404-ghost.png" "/public/404-ghost@2x.png")
for path in ${WGET_PATHS[@]}; do
  url="${GHOST_HOSTED_URL}${path}"
  echo "Generating static HTML files for url : ${url}"
  wget --mirror -p --no-host-directories --timestamping --restrict-file-name=unix --page-requisites --content-on-error --no-parent --directory-prefix ${GHOST_STATIC_CONTENT_DIR} ${url}
done
echo "***** Fetch complete for static HTML files *****"

# Strip query-string version suffixes from downloaded asset filenames (e.g. "style.css?v=abc123" -> "style.css").
find ${GHOST_STATIC_CONTENT_DIR} -name '*?v=*' -exec bash -c 'mv $0 ${0/?v=*/}' {} \;

echo " "
echo "***** Replace text with custom text started *****"
declare -A REPLACE_CONTENT=(
  ["${GHOST_HOSTED_URL}"]="${GHOST_STATIC_HOST_URL}"
  ["\"url\": \"${GHOST_STATIC_HOST_URL}/\""]="\"url\": \"${GHOST_STATIC_HOST}/\""
  ["\"@type\": \"WebSite\""]="\"@type\": \"WebPage\""
)

if [[ ! -z "${GHOST_STATIC_BLOG_PATH}" ]]; then
  REPLACE_CONTENT["${GHOST_STATIC_BLOG_DOMAIN}/${GHOST_STATIC_BLOG_PATH}\""]="${GHOST_STATIC_BLOG_DOMAIN}/${GHOST_STATIC_BLOG_PATH}/\""
fi

for rstring in "${!REPLACE_CONTENT[@]}"; do
  echo "Replace: ${rstring} -> ${REPLACE_CONTENT[${rstring}]}"
  find ${blog_dir} -type f -print0 | xargs -0 sed -i'' "s,${rstring},${REPLACE_CONTENT[${rstring}]},g"
  if [[ $? -ne 0 ]]; then
    echo "Error: Text replace failed"
    exit 1
  fi
done

IFS=',' read -r -a KEYS <<<"$CUSTOM_REPLACE_KEYS"
IFS=',' read -r -a VALUES <<<"$CUSTOM_REPLACE_VALUES"

if [[ ${#KEYS[@]} -ne ${#VALUES[@]} ]]; then
  echo "Error: Invalid environment variables CUSTOM_REPLACE_*. The number of comma separated items must be the same in both ENV variables."
  exit 1
fi

for i in "${!KEYS[@]}"; do
key=$(echo -e ${KEYS[$i]} | sed -e 's/^[[:space:]]*//' | sed -e 's/[[:space:]]*$//')
val=$(echo -e ${VALUES[$i]} | sed -e 's/^[[:space:]]*//' | sed -e 's/[[:space:]]*$//')
echo "Custom Replace: ${key} -> ${val}"
find ${blog_dir} -type f -print0 | xargs -0 sed -i'' "s,${key},${val},g"
if [[ $? -ne 0 ]]; then
echo "Error: Text replace failed"
exit 1
fi
done
echo "***** Text replace completed *****"

if [[ ! -z "${ROOT_INDEX_JSONLD}" ]]; then
echo " "
echo "***** Replace ld+json data in index.html *****"

sed "/<script type=\"application\/ld+json/,/<\/script>/c\
<script type=\"application/ld+json\">${ROOT_INDEX_JSONLD}</script>" ${blog_dir}/index.html >${blog_dir}/_index.html
mv -f ${blog_dir}/_index.html ${blog_dir}/index.html

echo "***** ld+json data replaced in index.html *****"
fi

if [[ ! -z ${S3_BUCKET_NAME} ]]; then
  echo " "
  echo "***** Started uploading files to S3 *****"
  aws s3 sync ${blog_dir} ${s3_blog_path} --exclude 'public/*' --exclude 'assets/*' --acl public-read --cache-control "no-store, no-cache, max-age=0, must-revalidate, post-check=0, pre-check=0" --delete
  if [[ $? -ne 0 ]]; then
    echo "Error: S3 upload error"
    exit 1
  fi
  aws s3 sync ${blog_dir}/public ${s3_blog_path}/public --acl public-read --cache-control "public, max-age=604800, must-revalidate" --delete
  aws s3 sync ${blog_dir}/assets ${s3_blog_path}/assets --acl public-read --cache-control "public, max-age=604800, must-revalidate" --delete
  echo "***** S3 upload complete *****"
else
  echo " "
  echo "If you want to upload the static site files to S3, provide the following ENV variables: S3_BUCKET_NAME, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION"
fi

echo " "
echo "***** FINISHED SUCCESFULLY *****"
