initial commit [WIP]

Juan Canham 2020-03-17 03:54:15 +00:00
commit 42786bdd92
35 changed files with 1555 additions and 0 deletions

1
.aspell.en.prepl Normal file

@ -0,0 +1 @@
personal_repl-1.1 en 0

69
.aspell.en.pws Normal file

@ -0,0 +1,69 @@
personal_ws-1.1 en 68
AAAAA
RaspberryPi
Grantham
DevOps
TheMajority
canham
statefull
Jinja
ElastiCache
Atlassian
Lamda
backends
unmanaged
Subreddit
subreddit
Config
orchestrator
Github
Gitlab
Rekognition
Calify
Cloudwatch
JuanCanham
pythonic
Cloudreach
Serverless
serverless
countryCode
SysOps
Powershell
Monit
Kaseya's
endDate
studyType
Cloudreach's
FastTrack
piRobotWars
Cloudformation
frontend
knowledgebase
Supporttree
whitepapers
prototyped
datacentre
skunkworks
Milkround
virtualised
Heymarket
Mopidy
CodeDeploy
Kaseya
subskills
hackathons
ChatOps
AppStream
SystemD
Musicbox
MusicBox
Norges
AnthillPro
hackathon
LinkedIn
NBIM's
natively
startDate
Kubernetes
Terraform
Ansible

3
.gitignore vendored Normal file

@ -0,0 +1,3 @@
node_modules/
dist/
*~

9
.yamllint.yaml Normal file

@ -0,0 +1,9 @@
---
extends: default
rules:
line-length: disable
document-start: disable
ignore: |
node_modules/

66
Makefile Normal file

@ -0,0 +1,66 @@
.PHONY: help mrproper all install clean local lint prepare validate build test clear post-test deploy deploy-full
SITENAME = cv.juancanham.com
hackmyresume=./node_modules/.bin/hackmyresume
linkchecker=linkchecker -f .linkcheckerrc --check-extern
help:
@echo 'Targets:'
@echo ' * all [clean install deploy-full]'
@echo ' * local [lint validate build test]'
@echo ' * deploy - does not clear the CDN'
@echo ' * deploy-full [deploy clear post-test]'
@echo ' * mrproper - removes node_modules'
all: clean install deploy-full
local: lint validate build test
deploy-full: deploy clear post-test
mrproper:
rm -rf ./node_modules/
install:
pip install -r requirements.txt
npm install
clean:
rm -r dist/* || true
lint:
shellcheck *.sh deploy/*.sh deploy/shared-functions
spelling=$$(aspell --ignore=4 --home-dir=. list < resume.yaml) ; \
if [ -n "$$spelling" ] ; then echo spelling errors found: $$spelling ; exit 1 ; fi
yamllint .
black *.py
pylint *.py
cfn-lint deploy/cloudformation/*
prepare: clean resume.yaml
./transform.py
./generate_qrcode.py "https://$(SITENAME)" QR
cp -r images dist/
rm dist/images/*~
validate: prepare
$(hackmyresume) validate dist/resume.json
$(hackmyresume) analyze dist/resume.json
build: prepare
./build.sh
test:
$(linkchecker) dist/resume.html
deploy: local
./deploy/deploy-cloudformation.sh $(SITENAME)
aws s3 sync --delete dist/ s3://$(subst .,-,$(SITENAME))
clear:
./deploy/clear-cache.sh $(SITENAME)
post-test:
$(linkchecker) https://$(SITENAME)
wget http://${SITENAME} -O /dev/null
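The `deploy` target's `$(subst .,-,$(SITENAME))` must produce the same bucket name as the site template's `!Join [-, !Split [., !Ref SiteName]]`, or the sync fails. A minimal sketch of that shared convention (Python, hypothetical helper name):

```python
def bucket_name(site_name: str) -> str:
    """Mirror of the Makefile's $(subst .,-,$(SITENAME)) and the
    template's !Join [-, !Split [., !Ref SiteName]]: the S3 bucket
    name is the site name with dots replaced by hyphens."""
    return site_name.replace(".", "-")

# The sync destination for this repo's SITENAME:
print(bucket_name("cv.juancanham.com"))  # cv-juancanham-com
```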

43
Readme.md Normal file

@ -0,0 +1,43 @@
# Juan Canham's Resume Repo
## Motivation
As a pragmatist I want to use existing tools, e.g. hackmyresume;
however, I also want to personalise the input format, as the roles on my CV
are less important than the projects and work that I did as part of them.
Additionally, I want a single tool to generate:
* My Dynamic Web CV e.g [cv.juancanham.com](https://cv.juancanham.com)
* Public Static CVs
## Contents
This repository contains
* [The source yaml file used to generate my CV](resume.yaml)
* [The resultant markdown](Resume.md)
* The tools to generate my CVs
* [The script used to generate it](build.sh)
* [The settings](settings.yaml)
* [The Intermediary json passed to hackmyresume](resume.json)
* [A script to perform arbitrary transforms on yaml](transform.py)
* The tools to host my CV:
* [cloudformation templates](deploy/cloudformation/)
* [scripts used to deploy them](deploy/)
* [A makefile to tie it all together](Makefile)
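The contents of transform.py are not shown in this commit view; a minimal hypothetical sketch of the kind of mapping it performs, with field names taken from resume.yaml and resume.jrs.json elsewhere in this commit (the real script may differ):

```python
def to_jrs_basics(source: dict) -> dict:
    """Hypothetical sketch: map the custom resume.yaml layout onto the
    JSON Resume 'basics' section. Field names are taken from the two
    files in this commit; the actual transform.py is not shown."""
    return {
        "name": source["name"],
        "label": source["info"]["label"],
        "summary": source["info"]["brief"],
        "email": source["contact"]["email"],
        "website": source["contact"]["website"],
        "picture": source["info"]["image"],
    }

src = {
    "name": "Juan Barry Manual Canham",
    "contact": {"email": "cv@juancanham.com", "website": "https://cv.juancanham.com"},
    "info": {"label": 'Cloud "DevOps" Engineer', "brief": "A Pragmatic Cloud DevOps Engineer", "image": "images/QR.png"},
}
print(to_jrs_basics(src)["email"])  # cv@juancanham.com
```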
## Dependencies
* node tools (see also [package.json](package.json))
* works on at least node 8
* hackmyresume
* kcv
* fresh
* python tools (see also [requirements.txt](requirements.txt))
* qrcode
* awscli - used to create the static s3+cloudfront website
* linters
## License
Code: BSD-2-Clause

97
Resume.md Normal file

@ -0,0 +1,97 @@
Juan Barry Manual Canham
============
Email: cv@juancanham.com
Web: https://cv.juancanham.com
A Pragmatic Cloud "DevOps" Engineer, with experience at a variety of companies, across a range of technologies,
driving both technological change and business-focused outcomes.
Capable of wearing whatever hat is needed for a given job, primarily working as both:
* an architect, aligning the technical solutions to the customers' requirements
* a technical lead, both delivering code and guiding/mentoring/supporting teams as required.
## SKILLS
- Programming: Python Ruby Bash JavaScript Apex/Java Other Languages
- AWS: Cloudformation IAM Lambda DynamoDB Core AWS services Additional AWS services
- DevOps tools and methodologies: IaC Immutability Configuration Management TDD - Infrastructure TDD - Application Build systems Containers Init systems Agile
- Google: GCP Google deployment manager Google App Engine Google Apps
- Salesforce: Apex Configuration
## EMPLOYMENT
### *Open Source Developer*, [Self](https://juancanham.com) (2019-07 — Present)
Spending a few months developing tools to make engineering in the clouds easier, along with other assorted tools.
- Quickly Built a tool to view twitter exports
- Begun work on a module
- Built a website to highlight the problems with FPTP
- Built a tool to monitor activity on toxic internet communities
### *Cloud Systems Developer Lead*, [Cloudreach](https://www.cloudreach.com) (2014-03 — 2019-07)
Worked on customer projects as a Lead/Architect and mentored a small team.
- Architect on several Enterprise engagements, at companies such as NBIM, BP, News UK, etc.
- Delivered both the architecture and implementation on multiple Cloud Access models
- Managed a team of 4 engineers, helping them get the most out of working for Cloudreach
- Helped run a city government hackathon (TFL)
### *Role Owner (Cloud Systems Developers)*, [Cloudreach](https://www.cloudreach.com) (2016-06 — 2019-07)
Worked with the leadership team to improve the System Developers role.
- Helped engineers improve their technical skills through a hands-on training workshop program
- Trained and mentored multiple sets of graduates
### *Internal hackathons/skunkworks Developer*, [Cloudreach](https://www.cloudreach.com) (2012-02 — 2019-07)
While at Cloudreach, worked on various spikes and hackathons
- Built Automated tanks that used image recognition to move and fire at each other.
- Built various useful internal tools, that remained in use for years
- Built a variety of IaC tools, that made deployments easier
- Won a trip to Central America
- Had a project open sourced externally, by somebody who found it useful
### *Cloud Systems Developer*, [Cloudreach](https://www.cloudreach.com) (2012-02 — 2014-03)
Worked on technical projects on AWS, Google Apps & Salesforce both individually and as part of a team.
- Worked across 3 cloud platforms (Google, AWS, Salesforce)
- Delivered difficult Google 2 Google migrations on tight deadlines
### *Support Engineer*, [Supporttree](https://supporttree.co.uk) (2010-01 — 2012-02)
Full support for small businesses including end user systems, on-premise servers and cloud services.
- Worked in a user facing role, for a variety of small businesses
- Made use of automation and unix, in an otherwise manual windows environment
## EDUCATION
### UCL (2006-09 — 2009-07)
### King's School Grantham (2004-09 — 2006-07)
## INTERESTS
- OPEN SOURCE & LINUX
- TRAVELLING
- PUNK ROCK, POLITICS & THE PUB

41
build.sh Executable file

@ -0,0 +1,41 @@
#!/usr/bin/env bash
set -e
PATH="./node_modules/.bin/:$PATH"
for content in $(yq -r '.generate | keys[]' settings.yaml); do
echo "=== Generating ${content} themes with hackmyresume ==="
for theme in $(yq -r ".generate.${content}.plain[]?" settings.yaml); do
hackmyresume build "dist/${content}.json" TO "dist/${theme##*-}.all" -t "$theme"
done
echo "=== Generating ${content} themes with css ==="
if yq -ce ".generate.${content}.css" settings.yaml; then
for theme in $(yq -r ".generate.${content}.css | keys[]" settings.yaml); do
for css in $(yq -r ".generate.${content}.css[\"$theme\"][]" settings.yaml); do
hackmyresume build "dist/${content}.json" TO "dist/${css}.all" -t "./node_modules/$theme" --css "$css"
done
done
fi
echo "=== Generating ${content} themes with kcv ==="
hackmyresume convert "dist/${content}.json" resume.json
for theme in $(yq -r ".generate.${content}.kcv[]?" settings.yaml); do
for format in html pdf; do
kcv export "dist/kcv-${theme}.${format}" --theme "$theme"
done
done
mv resume.json "dist/${content}.jrs.json"
done
rm dist/*.*.html
for ext in $(yq -r '.primary | keys[]' settings.yaml); do
cp "dist/$(niet ".primary.$ext" settings.yaml).$ext" "dist/resume.$ext"
done
cp dist/"$(niet '.primary.md' settings.yaml)".md Resume.md
cp dist/full.jrs.json ./resume.jrs.json
cp dist/resume.json ./
cp resume.yaml dist/
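build.sh names each plain theme's output after the final hyphen-separated component of the theme name via `${theme##*-}`; a sketch of that expansion (Python, hypothetical helper name):

```python
def theme_suffix(theme: str) -> str:
    """Python equivalent of bash's ${theme##*-}: drop everything up to
    and including the last hyphen; a name with no hyphen is unchanged."""
    return theme.rsplit("-", 1)[-1]

print(theme_suffix("fresh-theme-elegant"))  # elegant
print(theme_suffix("plain"))                # plain
```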

9
deploy/clear-cache.sh Executable file

@ -0,0 +1,9 @@
#!/usr/bin/env bash
set -e
# shellcheck disable=SC1090
. "$( dirname "${BASH_SOURCE[0]}" )/shared-functions"
DISTRIBUTION=$(get-output site CloudfrontDistribution)
INVALIDATION=$(aws cloudfront create-invalidation --paths '/*' --distribution-id "$DISTRIBUTION" --query Invalidation.Id --output text)
aws cloudfront wait invalidation-completed --distribution-id "$DISTRIBUTION" --id "$INVALIDATION"

22
deploy/cloudformation/certificate.yaml Normal file

@ -0,0 +1,22 @@
AWSTemplateFormatVersion: '2010-09-09'
Description: Template for an SSL certificate (must be deployed in us-east-1 for cloudfront)
Metadata:
License: magnet:?xt=urn:btih:1f739d935676111cfff4b4693e3816e664797050&dn=gpl-3.0.txt GPL-v3-or-Later
Parameters:
SiteName:
Type: String
Description: Name for the site
AllowedPattern: '[a-zA-Z0-9-.]{1,63}'
ConstraintDescription: must be a valid DNS name.
Resources:
Certificate:
Type: AWS::CertificateManager::Certificate
Properties:
DomainName: !Ref SiteName
ValidationMethod: DNS
Outputs:
CertificateARN:
Value: !Ref Certificate


@ -0,0 +1,94 @@
AWSTemplateFormatVersion: '2010-09-09'
Description: IAM Roles for an account running a static S3 website with CloudFront (IAMAdmin, Admin, S3Admin/User)
Metadata:
License: magnet:?xt=urn:btih:1f739d935676111cfff4b4693e3816e664797050&dn=gpl-3.0.txt GPL-v3-or-Later
Resources:
IAMAdmin:
Type: AWS::IAM::Role
Properties:
RoleName: IAMAdmin
ManagedPolicyArns:
- arn:aws:iam::aws:policy/AWSCloudFormationFullAccess
- arn:aws:iam::aws:policy/IAMFullAccess
- arn:aws:iam::aws:policy/ReadOnlyAccess
AssumeRolePolicyDocument:
Version: '2012-10-17'
Statement:
- Effect: Allow
Principal:
AWS: arn:aws:iam::212707113393:root
Action: sts:AssumeRole
- Effect: Allow
Principal:
Federated:
- !Sub arn:aws:iam::${AWS::AccountId}:saml-provider/Google
Action: sts:AssumeRoleWithSAML
Condition:
StringEquals:
SAML:aud: https://signin.aws.amazon.com/saml
Admin:
Type: AWS::IAM::Role
Properties:
RoleName: Admin
ManagedPolicyArns:
- arn:aws:iam::aws:policy/ReadOnlyAccess
- arn:aws:iam::aws:policy/AWSCloudFormationFullAccess
- arn:aws:iam::aws:policy/AmazonS3FullAccess
- arn:aws:iam::aws:policy/AmazonRoute53FullAccess
- arn:aws:iam::aws:policy/CloudFrontFullAccess
- arn:aws:iam::aws:policy/AWSCertificateManagerFullAccess
AssumeRolePolicyDocument:
Version: '2012-10-17'
Statement:
- Effect: Allow
Principal:
AWS: arn:aws:iam::212707113393:root
Action: sts:AssumeRole
- Effect: Allow
Principal:
Federated:
- !Sub arn:aws:iam::${AWS::AccountId}:saml-provider/Google
Action: sts:AssumeRoleWithSAML
Condition:
StringEquals:
SAML:aud: https://signin.aws.amazon.com/saml
User:
Type: AWS::IAM::Role
Properties:
RoleName: User
ManagedPolicyArns:
- arn:aws:iam::aws:policy/ReadOnlyAccess
- !Ref UserPolicy
AssumeRolePolicyDocument:
Version: '2012-10-17'
Statement:
- Effect: Allow
Principal:
AWS: arn:aws:iam::212707113393:root
Action: sts:AssumeRole
- Effect: Allow
Principal:
Federated:
- !Sub arn:aws:iam::${AWS::AccountId}:saml-provider/Google
Action: sts:AssumeRoleWithSAML
Condition:
StringEquals:
SAML:aud: https://signin.aws.amazon.com/saml
UserPolicy:
Type: AWS::IAM::ManagedPolicy
Properties:
Description: Grants access to Create/Update/Delete S3 objects & clear CloudFront caches
PolicyDocument:
Version: 2012-10-17
Statement:
- Effect: Allow
Action:
- s3:DeleteObject
- s3:PutObject
Resource: 'arn:aws:s3:::*/*'
- Effect: Allow
Action: cloudfront:CreateInvalidation
Resource: '*'

99
deploy/cloudformation/site.yaml Normal file

@ -0,0 +1,99 @@
AWSTemplateFormatVersion: '2010-09-09'
Description: Template deploys an S3 bucket, Route53 zone & CloudFront distribution to host a static website
Metadata:
License: magnet:?xt=urn:btih:1f739d935676111cfff4b4693e3816e664797050&dn=gpl-3.0.txt GPL-v3-or-Later
Parameters:
SiteName:
Type: String
Description: Name for the site
AllowedPattern: '[a-zA-Z0-9-.]{1,63}'
ConstraintDescription: must be a valid DNS name.
CertificateARN:
Type: String
Resources:
Bucket:
Type: AWS::S3::Bucket
Properties:
BucketName: !Join [-, !Split [., !Ref SiteName]]
AccessControl: PublicRead
WebsiteConfiguration:
ErrorDocument: resume.html
IndexDocument: resume.html
BucketPolicy:
Type: AWS::S3::BucketPolicy
Properties:
Bucket: !Ref Bucket
PolicyDocument:
Version: 2012-10-17
Statement:
- Sid: PublicReadGetObject
Effect: Allow
Principal: '*'
Action: s3:GetObject
Resource:
- !Sub ${Bucket.Arn}/*
Route53Zone:
Type: AWS::Route53::HostedZone
Properties:
HostedZoneConfig:
Comment: !Sub 'hosted zone for ${SiteName}'
Name: !Ref SiteName
Route53RecordIPv4:
Type: AWS::Route53::RecordSet
Properties:
AliasTarget:
DNSName: !GetAtt CloudfrontDistribution.DomainName
HostedZoneId: Z2FDTNDATAQYW2
HostedZoneId: !Ref Route53Zone
Name: !Ref SiteName
Type: A
Route53RecordIPv6:
Type: AWS::Route53::RecordSet
Properties:
AliasTarget:
DNSName: !GetAtt CloudfrontDistribution.DomainName
HostedZoneId: Z2FDTNDATAQYW2
HostedZoneId: !Ref Route53Zone
Name: !Ref SiteName
Type: AAAA
CloudfrontDistribution:
Type: AWS::CloudFront::Distribution
Properties:
DistributionConfig:
Aliases:
- !Ref SiteName
Enabled: true
HttpVersion: http2
IPV6Enabled: true
PriceClass: PriceClass_100
DefaultRootObject: resume.html
ViewerCertificate:
AcmCertificateArn: !Ref CertificateARN
SslSupportMethod: sni-only
Origins:
- Id: bucket
DomainName: !GetAtt Bucket.RegionalDomainName
S3OriginConfig:
OriginAccessIdentity: ''
DefaultCacheBehavior:
DefaultTTL: 3600
TargetOriginId: bucket
ViewerProtocolPolicy: allow-all
Compress: true
ForwardedValues:
QueryString: false
Outputs:
HostedZoneId:
Value: !Ref Route53Zone
HostedZoneRecords:
Value: !Join [",", !GetAtt Route53Zone.NameServers]
CloudfrontDistribution:
Value: !Ref CloudfrontDistribution

14
deploy/deploy-cloudformation.sh Executable file

@ -0,0 +1,14 @@
#!/usr/bin/env bash
set -e
# shellcheck disable=SC1090
. "$( dirname "${BASH_SOURCE[0]}" )/shared-functions"
export AWS_DEFAULT_REGION=us-east-1
deploy certificate SiteName="$SITENAME"
CERT=$(get-output certificate CertificateARN)
unset AWS_DEFAULT_REGION
deploy site SiteName="$SITENAME" CertificateARN="$CERT"

18
deploy/shared-functions Normal file

@ -0,0 +1,18 @@
#!/usr/bin/env bash
SITENAME=$1
PROJECT=${SITENAME//./}
CFN_DIR="$( dirname "${BASH_SOURCE[0]}" )/cloudformation"
DEPLOY_CMD="aws cloudformation deploy --no-fail-on-empty-changeset --tags Classification=Public Site=$SITENAME"
deploy() {
# shellcheck disable=SC2145
echo "deploying $1 [${@:2}]"
# shellcheck disable=SC2068
$DEPLOY_CMD --stack-name "${PROJECT}-$1" --template-file "${CFN_DIR}/$1.yaml" --parameter-overrides ${@:2}
aws cloudformation wait stack-exists --stack-name "${PROJECT}-$1"
}
get-output() {
aws cloudformation describe-stacks --stack-name "${PROJECT}-$1" --query "Stacks[0].Outputs[?OutputKey==\`$2\`].OutputValue" --output text
}
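shared-functions derives every stack name from the site name: `PROJECT=${SITENAME//./}` strips all dots, and each stack is `${PROJECT}-$1`. A sketch of that convention (Python, hypothetical helper name):

```python
def stack_name(sitename: str, template: str) -> str:
    """Mirror of shared-functions: PROJECT=${SITENAME//./} removes every
    dot, then the stack name is "${PROJECT}-$1" where $1 is the
    template name (e.g. "certificate" or "site")."""
    project = sitename.replace(".", "")
    return f"{project}-{template}"

print(stack_name("cv.juancanham.com", "site"))  # cvjuancanhamcom-site
```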

22
generate_qrcode.py Executable file

@ -0,0 +1,22 @@
#!/usr/bin/env python
"Creates a QR code with a smaller border than the standard cli"
import sys
import os
from qrcode import QRCode
def generate_qrcode(site, filename):
""" Function to generate QR code wth boarder of 1 """
qrcode = QRCode(border=1)
qrcode.add_data(site)
qrcode.make()
img = qrcode.make_image()
filename = os.path.join("images", filename)
filename += ".png"
img.save(filename)
if __name__ == "__main__":
generate_qrcode(sys.argv[1], sys.argv[2])

BIN
images/AWS-DevOps-Pro.png Normal file

BIN
images/LPCI-1.png Normal file

BIN
images/QR.png Normal file


BIN
images/qr.png Normal file


29
package.json Normal file

@ -0,0 +1,29 @@
{
"name": "juan-canham-resume",
"version": "0.9.0",
"description": "Juan Canham's Resume/CV",
"dependencies": {
"hackmyresume": "^1.8.0",
"kcv-cli": "^1.3.0",
"kcv-theme-fresh": "^0.0.4",
"resume-cli": "^1.2.7",
"jsonresume-theme-modern": "git+https://git.juancanham.com/JuanCanham/jsonresume-theme-modern.git",
"fresh-theme-bootstrap": "git+https://git.juancanham.com/JuanCanham/fresh-theme-bootstrap.git",
"fresh-theme-elegant": "git+https://git.juancanham.com/JuanCanham/fresh-theme-elegant.git#feature/interactive",
"fresh-themes": "git+https://git.juancanham.com/JuanCanham/fresh-themes.git#feature/certifications",
},
"devDependencies": {},
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "https://git.juancanham.com/JuanCanham/juan-canham-resume"
},
"keywords": [
"resume",
"CV"
],
"author": "Juan Canham <juan@juancanham.com>",
"license": "BSD-2-Clause"
}

7
requirements.txt Normal file

@ -0,0 +1,7 @@
awscli==1.18.*
black==19.10b0
cfn-lint==0.28.*
pylint==2.*
qrcode==6.*
yamllint==1.20.*
image==1.5.28

319
resume.jrs.json Normal file

@ -0,0 +1,319 @@
{
"basics": {
"name": "Juan Barry Manual Canham",
"label": "Cloud \"DevOps\" Engineer",
"summary": "A Pragmatic Cloud \"DevOps\" Engineer, with experience at a variety of companies, across a range of technologies,\ndriving both technological change and business-focused outcomes.\nCapable of wearing whatever hat is needed for a given job, primarily working as both:\n* an architect, aligning the technical solutions to the customers' requirements\n* a technical lead, both delivering code and guiding/mentoring/supporting teams as required.\n",
"website": "https://cv.juancanham.com",
"email": "cv@juancanham.com",
"picture": "images/QR.png",
"location": {
"city": "London",
"countryCode": "England",
"region": "EU"
},
"profiles": [
{
"network": "Git",
"username": "jc",
"url": "http://git.juancanham.com/"
},
{
"network": "LinkedIn",
"username": "juan-canham-aa005a51",
"url": "https://www.linkedin.com/in/juan-canham-aa005a51/"
},
{
"network": "Github",
"username": "JuanCanham",
"url": "https://github.com/juancanham"
}
]
},
"work": [
{
"company": "Self",
"website": "https://juancanham.com",
"position": "Open Source Developer",
"startDate": "2019-07",
"summary": "Spending a few months developing tools to make engineering in the clouds easier, along with other assorted tools",
"highlights": [
"Quickly Built a tool to view twitter exports",
"Begun work on a module",
"Built a website to highlight the problems with FPTP",
"Built a tool to monitor activity on toxic internet communities"
]
},
{
"company": "Cloudreach",
"website": "https://www.cloudreach.com",
"position": "Cloud Systems Developer Lead",
"startDate": "2014-03",
"endDate": "2019-07",
"summary": "Worked on customer projects as a Lead/Architect and mentored a small team.",
"highlights": [
"Architect on several Enterprise engagements, at companies such as NBIM, BP, News UK, etc.",
"Delivered both the architecture and implementation on multiple Cloud Access models",
"Managed a team of 4 engineers, helping them get the most out of working for Cloudreach",
"Helped run a city government hackathon (TFL)"
]
},
{
"company": "Cloudreach",
"website": "https://www.cloudreach.com",
"position": "Role Owner (Cloud Systems Developers)",
"startDate": "2016-06",
"endDate": "2019-07",
"summary": "Worked with the leadership team to improve the System Developers role.",
"highlights": [
"Helped engineers improve their technical skills through a hands-on training workshop program",
"Trained and mentored multiple sets of graduates"
]
},
{
"company": "Cloudreach",
"website": "https://www.cloudreach.com",
"position": "Internal hackathons/skunkworks Developer",
"startDate": "2012-02",
"endDate": "2019-07",
"summary": "While at Cloudreach, worked on various spikes and hackathons",
"highlights": [
"Built Automated tanks that used image recognition to move and fire at each other.",
"Built various useful internal tools, that remained in use for years",
"Built a variety of IaC tools, that made deployments easier",
"Won a trip to Central America",
"Had a project open sourced externally, by somebody who found it useful"
]
},
{
"company": "Cloudreach",
"website": "https://www.cloudreach.com",
"position": "Cloud Systems Developer",
"startDate": "2012-02",
"endDate": "2014-03",
"summary": "Worked on technical projects on AWS, Google Apps & Salesforce both individually and as part of a team.",
"highlights": [
"Worked across 3 cloud platforms (Google, AWS, Salesforce)",
"Delivered difficult Google 2 Google migrations on tight deadlines"
]
},
{
"company": "Supporttree",
"website": "https://supporttree.co.uk",
"position": "Support Engineer",
"startDate": "2010-01",
"endDate": "2012-02",
"summary": "Full support for small businesses including end user systems, on-premise servers and cloud services.",
"highlights": [
"Worked in a user facing role, for a variety of small businesses",
"Made use of automation and unix, in an otherwise manual windows environment"
]
}
],
"education": [
{
"institution": "UCL",
"startDate": "2006-09",
"endDate": "2009-07"
},
{
"institution": "King's School Grantham",
"gpa": "AAAAA",
"courses": [
"Maths",
"Further Maths",
"Physics",
"Chemistry",
"Spanish"
],
"startDate": "2004-09",
"endDate": "2006-07"
}
],
"skills": [
{
"name": "Programming",
"level": "Skilled",
"keywords": [
{
"name": "Python",
"level": "Skilled",
"summary": "Primary language used for most projects"
},
{
"name": "Ruby",
"level": "Skilled",
"summary": "Primary language used for some projects, also often used as part of Chef or other tools used within projects"
},
{
"name": "Bash",
"level": "Skilled",
"summary": "Used extensively for system automation"
},
{
"name": "JavaScript",
"level": "Knowledgeable",
"summary": "Experience with both frontend development and Node to deliver automation"
},
{
"name": "Apex/Java",
"level": "Basic",
"summary": "Primary language used for Salesforce development"
},
{
"name": "Other Languages",
"level": "Novice",
"summary": "Powershell, Vbs, batch, PHP, Perl, C, C#"
}
]
},
{
"name": "AWS",
"level": "Skilled",
"keywords": [
{
"name": "Cloudformation",
"level": "Skilled",
"summary": "Both natively and via Troposphere, Jinja & ruby-DSL"
},
{
"name": "IAM",
"level": "Skilled",
"summary": "Extensive knowledge in building secured multi-tenanted accounts"
},
{
"name": "Lambda",
"level": "Knowledgeable",
"summary": "Used extensively with both Python & JavaScript as part of stand-alone components & complex frameworks"
},
{
"name": "DynamoDB",
"level": "Knowledgeable",
"summary": "Used in-depth to store state for lambda, whenever S3 was not viable"
},
{
"name": "Core AWS services",
"level": "Skilled",
"summary": "Have used EC2, S3, RDS, SNS, SQS, Cloudwatch, Config, ElastiCache, etc, extensively"
},
{
"name": "Additional AWS services",
"level": "Knowledgeable",
"summary": "Have used other services such as Direct Connect, CodeDeploy, AppStream, etc"
}
]
},
{
"name": "DevOps tools and methodologies",
"level": "Skilled",
"keywords": [
{
"name": "IaC",
"level": "Skilled",
"summary": "Always used Infrastructure as Code (IaC), primarily native tools, but also Terraform when needed"
},
{
"name": "Immutability",
"level": "Skilled",
"summary": "Designed and implemented immutable systems"
},
{
"name": "Configuration Management",
"level": "Skilled",
"summary": "Extensive use of configuration management tools such as Chef and Ansible when needed"
},
{
"name": "TDD - Infrastructure",
"level": "Skilled",
"summary": "Used test driven development (TDD) on most greenfield projects and also gradually retrofitted to legacy infrastructure"
},
{
"name": "TDD - Application",
"level": "Moderate"
},
{
"name": "Build systems",
"level": "Skilled",
"summary": "Always used Build systems (Jenkins, AnthillPro, Code*, Gitlab, etc)"
},
{
"name": "Containers",
"level": "Knowledgeable",
"summary": "Familiar with immutable image pipelines and methodologies"
},
{
"name": "Init systems",
"level": "Knowledgeable",
"summary": "Used Upstart/SystemD/SysV/Monit as required, in particular use of signals/sockets when possible"
},
{
"name": "Agile",
"level": "Moderate",
"summary": "Used both within small companies and adapted to larger enterprises"
}
]
},
{
"name": "Google",
"level": "Knowledgeable",
"keywords": [
{
"name": "GCP",
"level": "Moderate",
"summary": "Developed high level account strategies for GCP"
},
{
"name": "Google deployment manager",
"level": "Moderate",
"summary": "Developed transformation tool using Google Deployment Manager"
},
{
"name": "Google App Engine",
"level": "Moderate",
"summary": "Some use as part of projects, primarily in Python"
},
{
"name": "Google Apps",
"level": "Skilled",
"summary": "Extensive use as part of Google Apps projects"
}
]
},
{
"name": "Salesforce",
"level": "Moderate",
"keywords": [
{
"name": "Apex",
"level": "Moderate",
"summary": "Experience writing both small classes and triggers"
},
{
"name": "Configuration",
"level": "Basic",
"summary": "Experience configuring Salesforce estates as per customer requirements"
}
]
}
],
"interests": [
{
"name": "Open Source & Linux"
},
{
"name": "Travelling"
},
{
"name": "Punk Rock, Politics & the Pub"
}
],
"languages": [
{
"language": "English",
"fluency": "Native"
},
{
"language": "Spanish",
"fluency": "Native"
}
]
}

1
resume.json Normal file

File diff suppressed because one or more lines are too long

479
resume.yaml Normal file

@ -0,0 +1,479 @@
---
name: Juan Barry Manual Canham
contact:
email: "cv@juancanham.com"
website: https://cv.juancanham.com
info:
label: Cloud "DevOps" Engineer
image: images/QR.png
brief: |
A Pragmatic Cloud "DevOps" Engineer, with experience at a variety of companies, across a range of technologies,
driving both technological change and business-focused outcomes.
Capable of wearing whatever hat is needed for a given job, primarily working as both:
* an architect, aligning the technical solutions to the customers' requirements
* a technical lead, both delivering code and guiding/mentoring/supporting teams as required.
location:
city: London
county: Greater London
country: England
region: EU
countryCode: GB
social:
- label: Git
network: Git
user: jc
url: http://git.juancanham.com/
- label: LinkedIn
network: LinkedIn
user: juan-canham-aa005a51
url: https://www.linkedin.com/in/juan-canham-aa005a51/
- label: Github
network: Github
user: JuanCanham
url: https://github.com/juancanham
certifications:
- organisation: AWS
certificates:
- name: Solutions Architect
level: Professional
logo: AWS-Solutions-Architect-Pro.png
- name: DevOps
level: Professional
logo: AWS-DevOps-Pro.png
- name: Solutions Architect
level: Associate
logo: AWS-Solutions-Architect-Associate.png
- name: SysOps Administrator
level: Associate
logo: AWS-Sysops-Associate.png
- name: Developer
level: Associate
logo: AWS-Developer-Associate.png
- organisation: Google
certificates:
- name: Cloud Architect
level: Professional
logo: GCP-Cloud-Architect-Pro.png
- name: Data Engineer
level: Professional
logo: GCP-Data-Engineer-Pro.png
- name: Google Apps Deployment Specialist
- organisation: Linux Professional Institute
certificates:
- name: LPCI-1
logo: LPCI-1.png
- organisation: Salesforce
certificates:
- name: Force.com Developer
logo: Salesforce-Platform-Developer.png
- name: Administrator
logo: Salesforce-administrator.png
- organisation: Microsoft
certificates:
- name: Windows 7, Configuration
level: Microsoft Certified Technology Specialist
employment:
summary: "9+ years cloud infrastructure experience as engineer, technical lead & architect"
history:
- employer: Self
url: https://juancanham.com
technologies: [Cloud, AWS, GCP, Azure, SSO, Open Source]
position: Open Source Developer
summary: Spending a few months developing tools to make engineering in the clouds easier, along with other assorted tools
start: 2019-07
highlights:
- Quickly Built a tool to view twitter exports
- Begun work on a module
- Built a website to highlight the problems with FPTP
- Built a tool to monitor activity on toxic internet communities
projects:
- name: Offline Twitter Export Viewer
summary: Simple tool to view twitter exports offline
url: https://gitlab.com/RitoingPacifst/offline-twitter-export-viewer
- name: Generic AWS Saml CLI (WIP)
summary: Generic SAML tool for AWS, to work with multiple providers and multiple backends using a modular pythonic design.
- name: TheMajority.uk
summary: Website generator combining markdown, Jinja & yaml. Also a website about proportional representation
url: https://gitlab.com/TheMajority/themajority.uk
- name: Subreddit Monitor
summary: Serverless Python bot, to monitor subreddit activity for cross-posts and notify users
- position: Cloud Systems Developer Lead
summary: Worked on customer projects as a Lead/Architect and mentored a small team.
employer: Cloudreach
url: https://www.cloudreach.com
description: |
Worked on customer projects as a Tech Lead/Architect.
Managed a team of 3 or 4 engineers within Cloudreach, making sure they got the most out of their role at Cloudreach,
aligning the individuals' personal development plans both with Cloudreach's goals and their longer-term career trajectories.
It was particularly rewarding getting team members promoted to Tech-lead level.
start: 2014-03
end: 2019-07
keywords: [Leadership, Mentoring, Architect, Tech Lead]
highlights:
- Architect on several Enterprise engagements, at companies such as NBIM, BP, News UK, etc.
- Delivered both the architecture and implementation on multiple Cloud Access models
- Managed a team of 4 engineers, helping them get the most out of working for Cloudreach
- Helped run a city government hackathon (TFL)
projects:
- name: Cloud Migration
customer: Norges Bank Investment Management
summary: Formed part of the CCOE supporting NBIM's datacentre exit onto immutable infrastructure in AWS
description: |
NBIM had an aggressive migration strategy, based on a standardised pipeline approach in order to
re-platform 150 applications into immutably deployed services within 8 months.
As part of the CCOE, helped build and maintain the pipeline (Cloudformation, Ansible, Packer, Jinja,
Powershell) in order to achieve this.
As well as supporting teams migrating applications, migrated applications and ensured best practices were
applied at an account level.
keywords: [CCOE, Deployment Pipeline, Immutable]
technologies: [AWS, Packer, Cloudformation, Ansible, Jinja, Python, Windows, Linux]
- name: AWS CIP
customer: BP
summary: Architect for BP's Cloud infrastructure Platform (AWS)
description: |
Architect/Tech Lead on the AWS side of BP's Cloud infrastructure Platform (CIP), responsible for
- Guiding high-level goals
- Interfacing with other teams and gathering requirements at a technical level
- Aligning high-level & mid-level architectures
- RBAC engine in a Multi-Tenanted account
- Supporting the team and ensuring code quality for services and customers
Transitioned the project from an EC2-focused offering in 1 region, with 4 supported OSes,
to a managed cloud native datacentre, offering use of 20 AWS services, across 2 regions with 6 supported OSes,
as well as aligning future visions towards a many-account model.
keywords: [Strategy, RBAC, Multi-Tenanted, Platform, Shared Services, Landing Zone, Multi-Region]
technologies: [AWS, IAM, Cloudformation, Windows, Linux, Python]
- name: Multi Cloud Lab Vending Engine
customer: Cloudreach
summary: Replaced long-lived multi-user labs with an on-demand multi-cloud lab vending engine
description: |
Transformed shared per-cloud lab accounts into an SSO-integrated, multi-cloud (AWS, Azure, GCP)
on-demand vending engine, improving security while reducing cost and increasing accountability.
In addition to pushing for the change on the business side, designed the API for integrating with
the serverless per-cloud solutions,
implemented the central orchestration workflows in Google Apps Script (JavaScript),
and wrote the serverless wrapper (Python) that provided the link between AWS's Landing Zone product and the
orchestrator.
keywords: [Multi-cloud, Strategy, Portal, Landing Zone, Multi-Account, Serverless, SSO, API Design]
technologies: [AWS, Azure, GCP, Account Vending Engine, JavaScript, Stateless, Python, Lambda, SAM]
- name: TFL hackathon
customer: TFL
summary: Helped run and judge the TFL Active Travel Hackathon
description: |
Provided expertise for teams making use of AWS while building solutions using TFL, Met Office &
Ordnance Survey's APIs.
keywords: [AWS, Hackathon, APIs]
technologies: [AWS]
- name: Pearson Governance Framework and Prototype
customer: Pearson
summary: Designed and prototyped Pearson's AWS governance strategy
description: |
Produced a cloud adoption and governance strategy, to reduce the unmanaged spend across 100+ accounts.
Provided a design for account structure, access, networking, security, monitoring, cost allocation and deployment.
Led the team building a serverless monitoring and triage framework (similar to cloud guardian/current AWS whitepapers),
as well as target account creation and hardening.
keywords: [Strategy, Multi-Tenanted, Platform, Shared Services, Landing Zone, Serverless]
technologies: [AWS, Cloudformation, Windows, Linux, Python]
- position: Role Owner (Cloud Systems Developers)
summary: Worked with the leadership team to improve the Systems Developer role.
employer: Cloudreach
url: https://www.cloudreach.com
description: |
Worked with the leadership team to improve the Systems Developer role.
This included technical tooling as well as non-technical initiatives and processes, such as training programs,
interview processes and the graduate program.
start: 2016-06
end: 2019-07
keywords: [Strategy, Vision, Internal, Personal Development]
highlights:
- Helped engineers improve their technical skills through a hands-on training workshop program
- Trained and mentored multiple sets of graduates
projects:
- name: Personal Growth workshops
summary: Introduced monthly hands-on training workshops.
description: |
Used in-house experts to deliver half-day, regional, hands-on workshops,
giving engineers a chance to use emerging technologies on realistic use cases, rather than under ideal conditions.
For example, it's easy to set up a Kubernetes demo, but most introductions don't cover dealing with sidecars and stateful containers.
The workshops also gave experienced engineers a chance to showcase their skills and produce content for our knowledgebase.
As the workshops were run separately, they were also an opportunity for the American and European offices to collaborate
on training materials, without having to deal with time-zones for the delivery.
keywords: [Personal Development, Training, Strategy, DevOps]
- name: Graduate/FastTrack Trainer/coordinator
summary: Involved in the delivery of 6 Fast Track training programs.
description: |
The Graduate/Fast Track program was an 8-10 week training course, initially for recent graduates
and later for anybody keen to retrain and start working in cloud-based DevOps.
Responsible for at least one two-week course, on either AWS or infrastructure as code, in every program;
as cloud computing progresses rapidly, this meant refreshing course materials and delivering
a hands-on course to groups of 6-20.
In addition to delivering the training, also worked with the leadership team and managers to ensure
graduates were placed onto appropriate projects.
keywords: [Personal Development, Training]
technologies: [AWS, IaC]
- name: Interview Process update
summary: Responsible for refining and updating the interview process
description: |
Refined the interview process through two major iterations.
The first standardised the process across all our European and American offices,
being more prescriptive in terms of scoring.
The second introduced more objective scoring criteria, while still giving interviewers enough scope to
exercise their own judgement.
keywords: [Interviews, Metrics]
- name: Unified Chatroom system & Knowledge base
summary: Helped establish a company wide unified Chatroom system & Knowledge base.
description: |
Standardising the company on a single chat system and knowledge base made it much easier for new colleagues to get up to
speed. Most of this work focused on making the case for unifying the tooling (in this case Slack and Atlassian),
and making sure there was a path forward for all the teams involved that wasn't seen as a regression.
The end result was much more cross-department knowledge sharing, mostly along technical lines;
it also helped organise events (both global and office-specific).
keywords: [ChatOps, Knowledge Sharing]
technologies: [Slack]
- position: Internal hackathons/skunkworks Developer
summary: While at Cloudreach, worked on various spikes and hackathons
employer: Cloudreach
url: https://www.cloudreach.com
description: |
Used various competitions and opportunities to build prototypes demonstrating the feasibility of tools;
as they were built over short periods of time, they were not production-ready,
however they worked, and often fed into the approaches used in projects.
start: 2012-02
end: 2019-07
keywords: [hackathon]
highlights:
- Built automated tanks that used image recognition to move and fire at each other.
- Built various useful internal tools that remained in use for years
- Built a variety of IaC tools that made deployments easier
- Won a trip to Central America
- Had a project open-sourced externally, by somebody who found it useful
projects:
- name: piRobotWars
summary: Automated tanks that used image recognition to move and fire at each other.
technologies: [Rekognition, RaspberryPi, Soldering, Python, Lambda]
- name: Calify
summary: Room booking system, based on Google calendar and android.
technologies: [Android, Java, Google Apps]
- name: Cloudformation Supremacy Engine
summary: Tools to facilitate better deployment of Cloudformation
technologies: [Python, Terraform, Cloudformation]
- name: Serverless Sceptre
summary: Tool for event driven Cloudformation deployments
technologies: [Lambda, IaC, Cloudformation, Python]
- name: Address Book, Contact sync, Holiday booking system
summary: Google apps automation used internally for booking holidays and syncing to mobile devices
technologies: [Google Apps, JavaScript]
- name: MusicBox
summary: Various iterations of the Office jukebox systems
technologies: [RaspberryPi, Musicbox, Mopidy, JavaScript]
- position: Cloud Systems Developer
employer: Cloudreach
url: https://www.cloudreach.com
summary: Worked on technical projects on AWS, Google Apps & Salesforce both individually and as part of a team.
start: 2012-02
end: 2014-03
highlights:
- Worked across 3 cloud platforms (Google, AWS, Salesforce)
- Delivered difficult Google to Google migrations on tight deadlines
projects:
- name: Salesforce Roll-out for Media Group
customer: Heymarket
summary: Re-implemented their sales processes in Salesforce and configured production & sandbox accounts
description: |
Part of the team doing the initial roll-out to replace legacy CRM systems; did the initial requirements gathering,
then matched the existing processes to those of Salesforce to fully automate the process of transforming
digital media leads to cash.
This involved both customising Salesforce objects/processes and writing Apex triggers and classes
(along with the tests required by the platform).
keywords: [Processes]
technologies: [Salesforce, Apex, SOQL]
- name: Cross Team Support at a Large Media company
customer: News UK
description: |
Working as the lead cloud systems engineer inside a large media corporation,
supported the infrastructure for all products being developed by partners on AWS,
as well as the build servers used to support those (and other) deployments.
Alongside supporting the running development environments,
developed and rolled out a unified platform & development kit to simplify application and
infrastructure build, deployment, monitoring and support,
reducing the overall support burden at the company by
helping other teams standardise on the platform where practical.
- name: Google to Google Migrations
customer: Various (Milkround, News UK, etc)
summary: Helped migrate various customers between Google Apps accounts
description: |
Due to limitations in the Google Apps platform, the source account had to be completely deleted prior to the end of the
migration; this meant the entire migration, including fixes and validation of data, had to be completed within about 60 hours.
Combined with the user-facing nature of mail migrations, this resulted in particularly difficult migrations
with tight deadlines.
keywords: [Migration, User Facing]
technologies: [Google Apps, Python, AWS]
- name: Google Migrations
customer: Various (Jamie Oliver group, Graze, etc)
summary: Helped migrate various customers between Google Apps accounts
keywords: [Migration, User Facing]
technologies: [Google Apps, Python, AWS]
- name: Teletext Holidays Optimisation
customer: Teletext Holidays
summary: Provided guidance and recommendations for frontend optimisation of the Teletext Holidays website
keywords: [Frontend, Web, Optimisation]
technologies: [JavaScript, Web]
- employer: Supporttree
position: Support Engineer
summary: Full support for small businesses including end user systems, on-premise servers and cloud services.
description: |
Providing 1st to 3rd line support of desktops (Windows and OS X),
servers (physical and virtualised) & services (in-house and cloud-based) for small businesses.
In addition to end-user support, also implemented several projects.
While the main toolkit for automation was Kaseya's custom DSL, pushed for automation whenever possible,
usually making use of either custom code or portable GNU tools.
start: 2010-01
end: 2012-02
url: https://supporttree.co.uk
keywords: [User Facing]
technologies: [Windows, OS X, Linux, Networking]
highlights:
- Worked in a user facing role, for a variety of small businesses
- Made use of automation and Unix tools, in an otherwise manual Windows environment
education:
level: Bachelor [incomplete]
history:
- institution: UCL
title: Bachelor Natural Sciences (Physics & Chemistry) [incomplete]
start: 2006-09
end: 2009-07
- institution: King's School Grantham
title: A-Levels
start: 2004-09
end: 2006-07
grade: AAAAA
curriculum:
- Maths
- Further Maths
- Physics
- Chemistry
- Spanish
skills:
levels: [Novice, Basic, Moderate, Knowledgeable, Skilled]
sets:
- name: Programming
level: Skilled
skills:
- name: Python
level: Skilled
summary: Primary language used for most projects
- name: Ruby
level: Skilled
summary: Primary language used for some projects, also often used as part of Chef or other tools used within projects
- name: Bash
level: Skilled
summary: Used extensively for system automation
- name: JavaScript
level: Knowledgeable
summary: Experience with both frontend development and Node to deliver automation
- name: Apex/Java
level: Basic
summary: Primary language used for Salesforce development
- name: Other Languages
level: Novice
summary: PowerShell, VBScript, batch, PHP, Perl, C, C#
- name: AWS
level: Skilled
skills:
- name: Cloudformation
level: Skilled
summary: Both natively and via Troposphere, Jinja & ruby-DSL
- name: IAM
level: Skilled
summary: Extensive knowledge in building secured multi-tenanted accounts
- name: Lambda
level: Knowledgeable
summary: Used extensively with both Python & JavaScript as part of stand-alone components & complex frameworks
- name: DynamoDB
level: Knowledgeable
summary: Used in-depth to store state for Lambda, whenever S3 was not viable
- name: Core AWS services
level: Skilled
summary: Have used EC2, S3, RDS, SNS, SQS, Cloudwatch, Config, ElastiCache, etc, extensively
- name: Additional AWS services
level: Knowledgeable
summary: Have used other services such as Direct Connect, CodeDeploy, AppStream, etc
- name: DevOps tools and methodologies
level: Skilled
skills:
- name: IaC
level: Skilled
summary: Always used Infrastructure as Code (IaC), primarily native tools, but also Terraform when needed
- name: Immutability
level: Skilled
summary: Designed and implemented immutable systems
- name: Configuration Management
level: Skilled
summary: Extensive use of configuration management tools such as Chef and Ansible when needed
- name: TDD - Infrastructure
level: Skilled
summary: Used test driven development (TDD) on most greenfield projects and also gradually retrofitted to legacy infrastructure
- name: TDD - Application
level: Moderate
- name: Build systems
level: Skilled
summary: Always used Build systems (Jenkins, AnthillPro, Code*, Gitlab, etc)
- name: Containers
level: Knowledgeable
summary: Familiar with immutable image pipelines and methodologies
- name: Init systems
level: Knowledgeable
summary: Used Upstart/SystemD/SysV/Monit as required, in particular use of signals/sockets when possible
- name: Agile
level: Moderate
summary: Used both within small companies and adapted for larger enterprises
- name: Google
level: Knowledgeable
skills:
- name: GCP
level: Moderate
summary: Developed high level account strategies for GCP
- name: Google deployment manager
level: Moderate
summary: Developed transformation tool using Google Deployment Manager
- name: Google App Engine
level: Moderate
summary: Some use as part of projects, primarily in Python
- name: Google Apps
level: Skilled
summary: Extensive use as part of Google Apps projects
- name: Salesforce
level: Moderate
skills:
- name: Apex
level: Moderate
summary: Experience writing both small classes and triggers
- name: Configuration
level: Basic
summary: Experience configuring Salesforce estates as per customer requirements
languages:
- language: English
fluency: Native
- language: Spanish
fluency: Native
interests:
- name: Open Source & Linux
- name: Travelling
- name: Punk Rock, Politics & the Pub
meta:
format: FRESH@1.0.0
version: 0.0.1
50
settings.yaml Normal file
@ -0,0 +1,50 @@
---
primary:
html: elegant
md: Compact
pdf: Compact
generate:
resume:
plain:
- Positive
- Compact
- Modern
css:
fresh-theme-bootstrap:
- darkly
- superhero
- cerulean
kcv:
- flat
- modern
full:
plain:
- node_modules/fresh-theme-elegant
links:
main:
- name: PDF
url: resume.pdf
- name: markdown
url: https://git.juancanham.com/JuanCanham/juan-canham-resume/Resume.md
- name: yaml
url: resume.yaml
extra:
- group: styles
links:
- darkly
- Positive
- Compact
- Modern
- superhero
- cerulean
- kcv-flat
- kcv-modern
- group: formats
links:
- name: json
url: full.json
- name: json-resume
url: full.jrs.json
63
transform.py Executable file
@ -0,0 +1,63 @@
#!/usr/bin/env python
"""
This script transforms a resume from a format containing positions within an
employer, into the valid FRESH resume format.
"""
import os
import json
import copy

import yaml

# pylint: disable=missing-function-docstring


def main():
    with open("resume.yaml") as file:
        resume = yaml.load(file, Loader=yaml.BaseLoader)
    transform_and_write("full", make_full, resume)
    transform_and_write("resume", make_fresh, resume)


def write_file(name, data):
    filename = os.path.join("dist", name + ".json")
    with open(filename, "w") as file:
        json.dump(data, file)


def transform_and_write(name, func, raw_data):
    data = copy.deepcopy(raw_data)
    data = func(data)
    write_file(name, data)


def make_fresh(data):
    data["skills"].setdefault("list", [])
    for skillset in data["skills"]["sets"]:
        skills = skillset["skills"][:]
        # Flatten each skillset to a list of names, collecting the full
        # skill entries into the top-level list.
        skillset["skills"] = [skill["name"] for skill in skills]
        data["skills"]["list"] += skills
    return data


def make_full(data):
    with open("settings.yaml") as file:
        data["settings"] = yaml.load(file, Loader=yaml.BaseLoader)
    tags = data.get("tags", [])
    for employer in data["employment"]["history"]:
        tags += employer.get("technologies", [])
        tags += employer.get("keywords", [])
        for project in employer.get("projects", []):
            tags += project.get("technologies", [])
            tags += project.get("keywords", [])
    # Order tags by frequency (most common first), then de-duplicate,
    # preserving that order.
    tags = sorted(tags, key=tags.count, reverse=True)
    data["tags"] = []
    for tag in tags:
        if tag not in data["tags"]:
            data["tags"].append(tag)
    return data


if __name__ == "__main__":
    main()
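For reference, the tag-ordering step in `make_full` sorts tags by how often they occur (relying on Python's stable sort) and then de-duplicates them while preserving that order. A minimal standalone sketch, using hypothetical sample tags:

```python
# Sketch of the tag-ordering behaviour in make_full(): sort by frequency,
# most common first, then de-duplicate while preserving order.
# The sample tags below are hypothetical.
tags = ["AWS", "Python", "AWS", "Linux", "AWS", "Python"]

# Python's sort is stable, so tags with equal counts keep their
# relative order from the input.
tags = sorted(tags, key=tags.count, reverse=True)

deduped = []
for tag in tags:
    if tag not in deduped:
        deduped.append(tag)

print(deduped)  # most frequent tag first: ['AWS', 'Python', 'Linux']
```

Note that `key=tags.count` is O(n²) over the tag list, which is fine at resume scale; a `collections.Counter` would be the usual choice for larger inputs.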