Categories
Computing

A Note on Connecting GitHub Webhooks with Jenkins in AWS

Jenkins is a potential CI/CD solution for my company. I'll admit that I have it in for Azure DevOps: no one at the company really understands it well, and the person who set it up has left. Jenkins, by contrast, is likely the leader in this space.

As part of this effort, I wanted to explore GitHub Webhooks as automatic triggers for builds in Jenkins.

There are many excellent resources online to help with setting up that integration. Here is one. Many resources reasonably assume connectivity between a Git repo and Jenkins. That connectivity is not necessarily a given, so I thought I'd share some issues I ran into connecting my personal public GitHub repo to a Jenkins box on AWS.

Using a public repo does simplify the GitHub side of the equation. On the AWS side, in my case, the Jenkins server has an Elastic IP protected by a Security Group. Only traffic originating from specific IPs is allowed, so one needs a way to accept only those packets that belong! The rub is that we want to allow the traffic coming from GitHub, which was denied by design.

GitHub actually makes this fairly easy by publishing their IP ranges here. It then simply became a matter of setting up (in my case) a dedicated Security Group that allows those IPs. GitHub also makes it fairly easy to test delivery and redelivery of Webhooks from the repo's Webhooks page.
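As a sketch of what that looks like in practice: GitHub's meta API (`https://api.github.com/meta`) lists the webhook source ranges under the `hooks` key, and those ranges can be shaped into the `IpPermissions` structure that EC2's authorize-security-group-ingress call expects. The port number and the helper names here are my own assumptions for illustration, not part of any official tooling.

```python
import json
from urllib.request import urlopen

JENKINS_PORT = 8080  # assumption: Jenkins listening on its default port

def fetch_hook_cidrs():
    """Fetch GitHub's published webhook source ranges from the meta API."""
    with urlopen("https://api.github.com/meta") as resp:
        return json.load(resp)["hooks"]

def ingress_permissions(cidrs, port=JENKINS_PORT):
    """Build the IpPermissions structure that EC2's
    authorize-security-group-ingress expects, from a list of CIDRs."""
    ipv4 = [c for c in cidrs if ":" not in c]  # keep only the IPv4 ranges
    return [{
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": c, "Description": "GitHub webhooks"}
                     for c in ipv4],
    }]
```

The resulting structure could be passed to the EC2 API (or adapted for the `aws ec2 authorize-security-group-ingress --ip-permissions` CLI flag) to populate the dedicated Security Group.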

As you can see from the image above, one simply needs to click the redelivery button to see whether connectivity was achieved. In my case, I kept tightening restrictions in the Security Group and re-testing delivery, confirming each time that the connection still succeeded.

Sidenote – Binding GitHub and Jenkins

One interesting configuration item: GitHub Webhooks point to the Jenkins server's URL followed by /github-webhook/. They do not point directly to the URL of the Jenkins Project that will do the build. Note the Payload URL below.
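For example, a webhook configuration pointing at a Jenkins box might look like this (the host and port are placeholders; only the /github-webhook/ path is the fixed part):

```
Payload URL:  http://<jenkins-elastic-ip>:8080/github-webhook/
Content type: application/json
```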

In Jenkins, you do stipulate the repo to which the Project will be bound, as shown below. There is additional configuration needed in Jenkins, but that's beyond the scope of this article and may be found here.

Photo by Edson Rosas on Unsplash


Passing the AWS Associate Exam

Looking for advice on passing an AWS Architecture Associate Exam? I had been looking for that advice very recently. Fortunately, there are plenty of good resources out there to help. (And yes, I did pass the exam recently.)

Advice isn’t in the form of exam answers. Well, it is if you’re willing to dig in.

I followed four tiers of study. You don’t need all four. You could do only Tier 1 and likely do great on the exam provided you dig into the tests as recommended.

Tier 1

  • linuxacademy.com — after each module, take the quizzes. Once you've worked through everything, take the exam, timed or not. The questions are NOT the questions on the AWS exam, but they are really helpful. The most important thing you can do is dig into all the answers, right or wrong. Two answers are often both reasonably good, with one detail making one incorrect and another correct. Those details are important for the actual exam! Understand why the wrong answers are wrong. You may not see the question on the actual exam, but digging into why a certain solution is not right for a problem helps you understand when it is the appropriate choice. Lastly, there is no substitute for taking the timed exams — even in the comfort of your home, you feel a little pressure, and that's a good thing.

Tier 2

  • acloud.guru — they also have an exam simulator. I found their courses to be just as good as Linux Academy's, and the exams were good too. I used their course a little less than Linux Academy's.
  • pluralsight.com (Elias Khnaser’s courses)

Tier 3

Two excellent free videos about VPC from re:Invent. (Yes, you can see a LOT from re:Invent without going.)

Both presentations are really insightful. In both cases, the presenters build your knowledge from the ground up. You could watch either one and walk away with a good understanding. I really do like both, and they are well worth your time. I listened to both several times during commutes — not ideal, but helpful nonetheless.

  • Amazon does guide you, to a great extent, in their book AWS Certified Solutions Architect Official Study Guide, available at Amazon. Before purchasing the Official Study Guide, read Casey Hendley's review of the book on Amazon's site. The book is a little dated given, as Casey says, that "Amazon changes things on AWS at a frightening pace". I would not use this book alone to pass the exam.

Tier 4 – Immersion

If you can afford it, attend re:Invent or a Summit. I’ve done both multiple times. Immersing yourself in the domain is worthwhile, and if you’re a working professional, hard to do on your own. I burn a week of vacation for re:Invent, spend the money and immerse!

Amazon offered a free HA course in NYC. So, I went and took it. Had a great time both with the course and in the city.

I live in Chicago, so I attended the Chicago Summits in 2016 and 2017.

Washington, DC is awesome, so I went to the Public Sector Summit. Don’t work in the public sector — didn’t care.

Went to re:Invent in Vegas twice. Those are more expensive efforts, but for me, well worth it.

 

(Feature Photo by Ben White on Unsplash)

 


AWS SWF Responsibility Patterns

Note: this is a conceptual "how does the SWF design pattern work" article rather than a how-to. In fact, the implementation does not need to be Amazon-specific, although SWF supports the pattern.

A Naive, Simple Code Example

We may write fairly complex algorithms to fulfill the needs of a complex process. One could write a single function that coordinates an activity such as processing an order. Our naive pseudo-code:

function processOrder(orderInformation) {
    var customer = verifyCustomerInfo(orderInformation);
    var inventory = verifyInventory(orderInformation);
    var payment = verifyCreditCard(orderInformation);
    var shipment = scheduleShipment(orderInformation);
    var notification = notify(orderInformation);
}

Obviously, we would need to evaluate each step and apply logic along the way. Our processOrder function will quickly become complex as we add decision logic and accumulate nested "if this, then that" branches.

In this example, Coordination and Decision are intermingled in the same function. As the complexity of possible outcomes increases, so does the code.
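To make that intermingling concrete, here is a small Python sketch of where processOrder ends up once decision logic creeps in. The helper functions are hypothetical stand-ins for real service calls, and the outcomes are illustrative only:

```python
# Hypothetical sketch: coordination (what happens next) and decision
# (whether it should happen) tangled together in one function.

def verify_customer_info(order):
    # Stand-in for a real customer lookup.
    return order.get("customer_ok", True)

def verify_inventory(order):
    # Stand-in for a real inventory check.
    return order.get("in_stock", True)

def process_order(order):
    # Each new business rule adds another branch here, and the
    # function grows with every possible outcome.
    if not verify_customer_info(order):
        return "rejected: bad customer info"
    if not verify_inventory(order):
        return "backordered"
    return "scheduled"
```

Every additional step (payment, shipment, notification) multiplies the branches, which is exactly the growth in complexity the article describes.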

SWF Design Pattern

AWS SWF can help with coordination patterns such as the one above by giving certain elements very specific duties to perform, and by having those elements perform highly cohesive responsibilities of Coordination, Decision and Work.

Main elements of SWF:

  • SWF (Coordinator)
  • Decider
  • Activity Workers

The Workflow Starter initiates the workflow lifecycle. From there, the remaining three elements perform specific duties in a cyclical pattern: SWF -> Decider -> SWF -> Activity Worker.

A generic pattern looks like this:

  • SWF receives initiation, updates workflow history and schedules a decision task.
  • Decider receives the task, evaluates history and decides the next Activity.
  • SWF receives decision, then schedules an Activity, and waits for the completion of that Activity.
  • The Activity Worker receives the task, executes it and returns to SWF.
  • SWF receives the result, updates workflow history and schedules a decision task.
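The cycle above can be simulated in plain Python, with no AWS calls at all — which underscores the note at the top that the pattern is not Amazon-specific. All names and the canned activity list are illustrative:

```python
# Illustrative simulation of the cycle: SWF -> Decider -> SWF -> Activity Worker.
# SWF coordinates and records history; the Decider only decides; workers only work.

ACTIVITIES = ["verify_customer", "verify_inventory", "charge_card"]

def decider(history):
    """Evaluate the workflow history and decide the next activity
    (returning None to close the workflow)."""
    done = [e["activity"] for e in history if e["event"] == "completed"]
    for activity in ACTIVITIES:
        if activity not in done:
            return activity
    return None  # everything completed: close the workflow

def activity_worker(activity):
    """Perform the actual work; here it just returns a canned result."""
    return activity + ": ok"

def swf_run():
    """The coordinator: records history, schedules decision tasks and
    activities, but never does the work or makes the decisions itself."""
    history = [{"event": "workflow_started"}]
    while True:
        next_activity = decider(history)          # schedule a decision task
        if next_activity is None:
            history.append({"event": "workflow_closed"})
            return history
        result = activity_worker(next_activity)   # schedule the activity
        history.append({"event": "completed",
                        "activity": next_activity,
                        "result": result})
```

Note how the roles stay cohesive: swf_run never inspects business state, the decider never performs work, and the worker never sees the history — mirroring the division of responsibilities described above.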

In the initial naive code example above, the processOrder function appears to have two responsibilities: Coordination and Decision. SWF will allow us to segregate those activities and have highly cohesive responsibilities.

SWF Conceptual Takeaways

  • The cycle occurring n times is: SWF -> Decider -> SWF -> Activity Worker.
  • Workflow History plays an important role and is recorded by SWF.
  • Given Workflow History, we can amend this cyclic pattern in the following way:
    • SWF (log history)
    • Decider
    • SWF (schedule activity)
    • Activity Worker
  • Updating History allows the Decider to decide appropriately.
  • Each element receives something from the previous one, which implies sharing or passing of data/state.
  • SWF itself is a coordinator and will NOT perform the work. It either calls the Decider or an Activity Worker.
  • Activity Workers perform work.
  • An Activity Worker is called by and returns results to SWF.
  • Deciders decide which Activity is next, but the Decider does NOT call an Activity Worker directly.
  • Deciders review history, evaluate it and decide the next Activity.
  • The Decider can close a Workflow.