Export GCP Stackdriver Log With Filebeat
This is a bash script that configures a GCP project to export Stackdriver logs by creating a log sink that publishes to a Pub/Sub topic; Filebeat then subscribes to that topic through its Google Cloud module.
```bash
#!/bin/bash
# author: me 😃
#
# $ bash gcloud-admin.sh -h
# Required parameters:
#   -id|--project-id: gcloud project id
#   -svs|--svs-account: gcloud service account name to collect logs
# Optional parameters:
#   -h|--help: Print this message

readonly ARGS="$@"
readonly dependencies=( "gcloud" )

processArgs(){
  while [[ "$#" -gt 0 ]]; do
    key="$1"
    case "$key" in
      -h|--help)
        PRINT_HELP=true
        ;;
      -id|--project-id)
        PROJECT_ID="$2"
        shift
        ;;
      -svs|--svs-account)
        SVS_ACCOUNT="$2"
        shift
        ;;
    esac
    shift
  done
}

checkDependencies() {
  local unmet_dependencies=false
  for dependency in "${dependencies[@]}" ; do
    command -v "${dependency}" >/dev/null 2>&1 || {
      echo >&2 "${dependency} required"
      unmet_dependencies=true
    }
  done
  if [ "${unmet_dependencies}" = true ] ; then
    echo "Please install unmet dependencies above before running."
    exit 1
  fi
}
```
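After argument parsing and dependency checks, the export wiring itself boils down to a handful of `gcloud` commands. Below is a minimal sketch of that flow; the topic, sink, and subscription names are hypothetical placeholders (a full script would derive them from the parameters above):

```shell
# hypothetical names; substitute your own
PROJECT_ID="my-project"
TOPIC="stackdriver-logs"
SINK="stackdriver-logs-sink"
SUBSCRIPTION="filebeat-sub"

# create the Pub/Sub topic that will receive exported log entries
gcloud pubsub topics create "${TOPIC}" --project "${PROJECT_ID}"

# create a log sink that routes the project's logs to that topic
gcloud logging sinks create "${SINK}" \
  "pubsub.googleapis.com/projects/${PROJECT_ID}/topics/${TOPIC}" \
  --project "${PROJECT_ID}"

# the sink writes as a generated service account; grant it publish rights
WRITER_IDENTITY=$(gcloud logging sinks describe "${SINK}" \
  --project "${PROJECT_ID}" --format 'value(writerIdentity)')
gcloud pubsub topics add-iam-policy-binding "${TOPIC}" \
  --project "${PROJECT_ID}" \
  --member "${WRITER_IDENTITY}" --role roles/pubsub.publisher

# create the subscription Filebeat will pull from
gcloud pubsub subscriptions create "${SUBSCRIPTION}" \
  --topic "${TOPIC}" --project "${PROJECT_ID}"
```

On the Filebeat side, the remaining work is enabling its Google Cloud module (e.g. `filebeat modules enable gcp` on recent versions) and pointing it at the subscription name and the service account's credentials file.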
jq examples
jq is a command-line processor for parsing and transforming JSON data. You can find the detailed documentation here, and you can play with it online at jqplay.org. Below are a few jq command-line examples that I have found useful for day-to-day work 😃.
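As a warm-up before hitting a real API, a few common filters can be tried against a small inline document. The sample JSON and field names below are made up for illustration:

```shell
# hypothetical sample data
json='{"users":[{"name":"amy","age":31},{"name":"bob","age":27}]}'

# extract one field from every array element (-r prints raw strings)
echo "$json" | jq -r '.users[].name'
# amy
# bob

# keep only the elements matching a condition
echo "$json" | jq -c '[.users[] | select(.age > 30)]'
# [{"name":"amy","age":31}]

# reshape: build a single object mapping names to ages
echo "$json" | jq -c '.users | map({(.name): .age}) | add'
# {"amy":31,"bob":27}
```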
Let’s try jq with the AWS Price List API.
```shell
# to parse and extract json data
$ curl -s \
    https://pricing.us-east-1.amazonaws.com/offers/v1.0/aws/index.json \
  | jq .offers.AmazonEC2
{
  "offerCode": "AmazonEC2",
  "versionIndexUrl": "/offers/v1.
```