Export GCP Stackdriver Log With Filebeat
This is a bash script that configures a GCP project to export logs by creating a Pub/Sub sink topic, and lets Filebeat subscribe to that topic through the Filebeat Google Cloud module.
#!/bin/sh
# author: me 😃
#
# $ bash gcloud-admin.sh -h
#
# Required parameters:
#   -id|--project-id: gcloud project id
#   -svs|--svs-account: gcloud service account name to collect logs
# Optional parameters:
#   -h|--help: Print this message

readonly ARGS="$@"
readonly dependencies=( "gcloud" )

processArgs(){
  while [[ "$#" -gt 0 ]]; do
    key="$1"
    case "$key" in
      -h|--help)
        PRINT_HELP=true
        shift
        ;;
      -id|--project-id)
        PROJECT_ID="$2"
        shift
        ;;
      -svs|--svs-account)
        SVS_ACCOUNT="$2"
        shift
        ;;
    esac
    shift
  done
}

checkDependencies() {
  local unmet_dependencies=false
  for dependency in "${dependencies[@]}" ; do
    command -v "${dependency}" >/dev/null 2>&1 || {
      echo >&2 "${dependency} required"
      unmet_dependencies=true
    }
  done
  if [ "${unmet_dependencies}" = true ] ; then
    echo "Please install unmet dependencies above before running.
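The excerpt above is cut off before the gcloud calls themselves. As a rough, hypothetical sketch (the topic, subscription, and sink names below are placeholders I made up, and SINK_WRITER_IDENTITY stands for the writer identity reported by gcloud logging sinks describe), the remainder of such a script would typically do something like:

# Hypothetical continuation of the script above; names are placeholders.
# Create the Pub/Sub topic and the subscription Filebeat will pull from.
gcloud pubsub topics create stackdriver-logs --project "${PROJECT_ID}"
gcloud pubsub subscriptions create filebeat-sub \
  --topic stackdriver-logs --project "${PROJECT_ID}"

# Route the project's logs into the topic with a log sink.
gcloud logging sinks create filebeat-sink \
  "pubsub.googleapis.com/projects/${PROJECT_ID}/topics/stackdriver-logs" \
  --project "${PROJECT_ID}"

# The sink writes with its own service-account identity, which needs
# permission to publish to the topic (read it back with
# `gcloud logging sinks describe filebeat-sink`).
gcloud pubsub topics add-iam-policy-binding stackdriver-logs \
  --member "serviceAccount:${SINK_WRITER_IDENTITY}" \
  --role roles/pubsub.publisher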
AWS | Boto3 | Python
This is an example of how to create your own Python boto3 class and use it in your day-to-day work 😃. Please feel free to 👉📱 message my Twilio bot +447479275693. I will get back to you shortly 😃.
import boto3
import os

"""
how to use this class

import aws_modules.get_all_sg_rules
sg_rule = aws_modules.get_all_sg_rules.sg(aws_account)  # passing aws_account value to retrieve all sg rules
sg_rule_result = sg_rule.getSgRules()
"""

class sg:
    def __init__(req, aws_account):
        req.aws_account = aws_account

    def getSgRules(req):
        try:
            os.
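The excerpt cuts off inside getSgRules. Purely as an illustration of where it is likely heading, and not the author's actual code, here is a minimal sketch under two assumptions: that aws_account maps to a named AWS profile, and that the method returns the inbound rules of all security groups.

import boto3
import os

class sg:
    def __init__(req, aws_account):
        req.aws_account = aws_account

    def getSgRules(req):
        # Assumed approach: select the AWS account via a named profile and
        # return every inbound security group rule visible to that account.
        try:
            os.environ["AWS_PROFILE"] = req.aws_account  # hypothetical account -> profile mapping
            ec2 = boto3.client("ec2")
            groups = ec2.describe_security_groups()["SecurityGroups"]
            # Flatten the inbound rules of every group into one list.
            return [rule for group in groups for rule in group.get("IpPermissions", [])]
        except Exception as err:
            print(f"failed to fetch security group rules: {err}")
            return []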
Amazon S3
The aws s3 CLI is great! You can easily move your local files to your AWS S3 buckets. However, sometimes it is not that easy to do simple tasks - like copying files whose names contain whitespace, deleting all versions of all files in a versioned S3 bucket, or understanding the difference between aws s3 sync and aws s3 cp --recursive.
Escape the whitespace in your file name
When you have a long list of files to upload to an S3 bucket, it is much easier to loop over them if the filenames contain no whitespace.
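If you do have spaces in the names, quoting the variable that holds each filename is usually enough. A minimal sketch, assuming a made-up bucket name and a plain-text list of paths:

# Upload every path listed in files_to_upload.txt, quoting "$file"
# so names containing spaces survive word splitting.
while IFS= read -r file; do
  aws s3 cp "$file" "s3://my-example-bucket/$(basename "$file")"
done < files_to_upload.txt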
Lambda Logshipper
How can you easily move your CloudWatch log stream to another platform or log collector endpoint? The easiest way is to ship it through a socket client. This is an example of a small Golang Lambda function that ships an AWS CloudWatch log stream to a TCP endpoint.
You will need a socket client:
func SocketClient(m []byte) {
	conn, err := net.Dial("tcp", "your_tcp_endpoint:your_port")
	defer conn.Close()
	if err != nil {
		fmt.
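The excerpt above is cut off; note too that checking err before deferring conn.Close() avoids closing a nil connection when the dial fails. A self-contained sketch of the socket client together with a Lambda handler, assuming the function is invoked by a CloudWatch Logs subscription filter (the TCP address stays a placeholder), might look like this:

package main

import (
	"fmt"
	"net"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// SocketClient ships one message to the TCP endpoint; the address is a placeholder.
func SocketClient(m []byte) {
	conn, err := net.Dial("tcp", "your_tcp_endpoint:your_port")
	if err != nil {
		fmt.Println("dial error:", err)
		return
	}
	defer conn.Close()

	if _, err := conn.Write(append(m, '\n')); err != nil {
		fmt.Println("write error:", err)
	}
}

// handler decodes the gzipped, base64-encoded CloudWatch Logs payload and
// forwards each log event to the TCP endpoint.
func handler(event events.CloudwatchLogsEvent) error {
	data, err := event.AWSLogs.Parse()
	if err != nil {
		return err
	}
	for _, logEvent := range data.LogEvents {
		SocketClient([]byte(logEvent.Message))
	}
	return nil
}

func main() {
	lambda.Start(handler)
}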