
How I used Travis CI to deploy Barfer on Azure

Ok, this time I want to talk a little bit about how I used Travis CI to deploy Barfer on Azure. Last time I mentioned how much it helped to have a Continuous Delivery system up and running, so I thought it would be worth expanding on the concept a little.

Some of you may say: “Azure already has a continuous delivery facility, why use another service?”. Well, there are a few reasons:

  • when I started writing Barfer I had no idea I was going to deploy on Azure
  • in case I move to another hosting service, I’ll just have to tweak the deployment configuration a little
  • CD on Azure is clunky and to be honest I still don’t know if I like Kudu.

Let me spend a few words on the last point. Initially I wanted to deploy all the services on a Linux Web App. Keep in mind that I wrote Barfer using Visual Studio Code on a Macbook Pro. So I linked the GitHub repo to the Azure Web App and watched my commits being transformed into Releases.

Well, it turns out that Kudu on Linux is not exactly friendly. Also, after the first couple of commits it was no longer able to delete some hidden files in the node_modules folder. Don’t ask me why.

I spent almost a week banging my head against that; in the end I did two things:

  1. moved to a Windows Web-App
  2. dropped Azure CD and moved to Travis CI

Consider also that I had to deploy 2 APIs, 1 web client and 1 background service (the RabbitMQ subscriber), and to be honest I have absolutely no desire to learn how to write a complex deployment script. I want tools that help me do the job, not block my way.

The Travis CI interface is very straightforward: once logged in, link your account to your GitHub one and pick the repository you want to deploy. Then all you have to do is create a .yml script with all the steps you want to perform and you’re done.

Mine is quite simple: since I am currently deploying all the services together (even though each one has its own independent destination), the configuration I wrote makes use of 5 Build Stages. The first one runs all the unit and integration tests; then for every project there’s a custom script that:

  1. downloads the node packages (or fetches them from the cache)
  2. builds the sources
  3. deletes the unnecessary files
  4. pushes all the changes to Azure
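
To give you an idea, a trimmed-down .travis.yml using Build Stages looks more or less like this. This is only a sketch: the stage names, script paths and Node version here are illustrative, not my actual file.

```yaml
language: node_js
node_js:
  - "8"

# cache node_modules so packages are fetched from the cache when possible
cache:
  directories:
    - node_modules

jobs:
  include:
    # first stage: run all the unit and integration tests
    - stage: test
      script: npm test
    # one stage per project: build, clean up, push to Azure
    - stage: deploy api
      script: bash ./scripts/deploy-api.sh
    - stage: deploy web client
      script: bash ./scripts/deploy-web.sh
```

Each deploy script would then take care of building the sources, deleting the unnecessary files and pushing the result to the Azure remote.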

The whole process takes approximately 10 minutes to run, because every commit deploys all the projects, regardless of where the actual changes are. I still have to dig deeper into features like tags or special git branches, but I will probably just split the repo, one per project. I just have to find the right way to manage the shared code.

#hashtags just landed on #Barfer!

Yeah I know, I blogged yesterday. I probably have too much spare time these days (my wife is abroad for a while) and Barfer has become some kind of obsession.

You don’t know what Barfer is? Well, go check my previous article right away. Don’t worry, I’ll wait here.

So the new shiny things are #hashtags! Yeah, exactly: now you can barf adding your favourite #hashes and you can even use them to filter #barfs!

For now the implementation is very simple: just a string array containing the tags, as you can see from the Barf interface defined here.
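
It is something along these lines (a sketch, not the actual file: every field except tags is illustrative):

```typescript
// Hypothetical sketch of the Barf model; only `tags` is described
// in this post, the other fields are made up for illustration.
interface Barf {
    id: string;
    authorId: string;
    text: string;
    creationDate: number;
    tags: string[]; // the hashtags extracted from `text`, without the leading '#'
}

const sample: Barf = {
    id: "1",
    authorId: "davide",
    text: "hello #world",
    creationDate: Date.now(),
    tags: ["world"],
};
```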

The command handler responsible for storing the Barf uses a regex to parse the text and extract all the tags (apart from checking for XSS, but that’s another story).
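
The extraction step can be sketched like this. The exact regex is an assumption on my part here — a '#' followed by word characters — and I’m lowercasing the tags so filtering can be case-insensitive; the XSS checks are omitted.

```typescript
// Sketch of the tag extraction step; the actual regex and the XSS
// sanitization in the real command handler may differ.
function extractTags(text: string): string[] {
    const tagRegex = /#(\w+)/g;
    const tags = new Set<string>();
    let match: RegExpExecArray | null;
    while ((match = tagRegex.exec(text)) !== null) {
        // store tags lowercase so filtering can ignore case
        tags.add(match[1].toLowerCase());
    }
    return Array.from(tags);
}
```

So `extractTags("I just #barfed about #TravisCI")` gives back `["barfed", "travisci"]`, duplicates included only once.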

On the UI side, before being rendered, every Barf goes through the same regex, but this time each #tag is replaced with a link to the archive.
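
The render-side pass boils down to something like this (the /barfs?tag= URL shape is just a guess at the archive route, not the real one):

```typescript
// Render-side sketch: same regex, but each #tag becomes a link to the
// tag archive. The /barfs?tag= URL shape is hypothetical.
function linkifyTags(text: string): string {
    return text.replace(/#(\w+)/g, (_full, tag: string) =>
        `<a href="/barfs?tag=${tag.toLowerCase()}">#${tag}</a>`);
}
```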

Quick&dirty.

Next in line would be adding some analytics to them, but that would require a much bigger community 😀

I also went through some small refactoring and cleanup of the frontend code, although I will probably move to an SPA sooner or later. Thing is, I’m still not sure whether to use React, Angular or Vue, so in the meantime I’m focusing on the backend.

There are so many features I would like to add that, to be honest, I prefer not to focus on the frontend for now. Maybe I’ll start looking for somebody to help me with that.

One thing I’m quite happy with for now, but plan to rework, is CI/CD. Well, for now I’m working alone on this, so I probably can’t really talk about integration. But whatever.
As I wrote already, I’m using Travis CI and I’m very happy with the platform. Even though I’m still on the free tier, the features are awesome and the flexibility is huge. I’ll probably write a blog post on this in the next few days.

In the meanwhile, #happy #barfing! 

I’m becoming a Barfer!

More than a month! My last post on this blog was more than one month ago. I should write more often. No wait let me rephrase that: I should write on this blog more often.

Why? How did I spend my last month? Barfing, that’s how!

Ok, let’s add more details. A while ago I decided it was a good idea to start moving a little bit away from the .NET world and explore what’s around. And NodeJs arrived, of course with Typescript: I am 100% sure it’s basically impossible to write a semi-reliable system without some kind of type checking (along with 10000 other things).

Then I said: “I don’t want to just read a manual, what can I write with it?”. For some crazy reason I opted for a Twitter clone. Yeah, I was really bored.

Early in the analysis phase RabbitMQ and MongoDb joined the party. I was not exactly familiar with RabbitMQ, so I thought it was a good opportunity to learn something new.

In order to speed up development and to obtain certain features (e.g. authentication) I have used a bunch of third-party services.

The system uses a microservices architecture with a front-end that acts as an API gateway. For each service I’ve taken the CQRS path, along with Publish/Subscribe. An Azure WebJob is scheduled to run continuously, listening to the various events/messages on the queues. I’ll blog more about the architecture, but that’s more or less it.
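
Just to give the flavor of the Publish/Subscribe part, here’s a toy in-memory version of the flow. In the real system the events of course travel over RabbitMQ and the subscriber runs as the WebJob; everything below, names included, is made up for illustration.

```typescript
// Toy in-memory event bus illustrating the Publish/Subscribe flow.
type Handler = (payload: unknown) => void;

class EventBus {
    private handlers: { [event: string]: Handler[] } = {};

    subscribe(event: string, handler: Handler): void {
        (this.handlers[event] = this.handlers[event] || []).push(handler);
    }

    publish(event: string, payload: unknown): void {
        for (const handler of this.handlers[event] || []) {
            handler(payload);
        }
    }
}

// e.g. the command side publishes an event after storing a Barf,
// and a subscriber updates the read model (the CQRS query side)
const bus = new EventBus();
const readModel: string[] = [];
bus.subscribe("barf.created", (barf) =>
    readModel.push((barf as { text: string }).text));
bus.publish("barf.created", { text: "hello #world" });
```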

What am I expecting from this? Well, it’s simple: nothing. I mean, I’m not expecting people to use it (even though it would be very nice), I am just exploring new possibilities. Nothing more.

Oh yeah, before I forget: https://barfer.azurewebsites.net/ . Enjoy!

Yet another “How to use SASS with WordPress” guide

Yes, it’s another one. If you search on Google there are tons of articles about how to use SASS in a WordPress theme, so why write another one?

Well, the answer is simple. Because I can. Because I am bored. Because I’m going to give you the sources with no fuss.

First of all, take a look at this repo.

As you can easily notice, it contains part of the standard WordPress folder structure and a bunch of other files. And trust me, I am not that kind of guy who adds files for nothing.

The main idea here is to have Gulp search and watch for our .scss files in the child theme folder and build the final style.css file every time something changes. Nice, isn’t it?

Before we start we need of course to install some dependencies. Fire up the Terminal and run:

sudo npm install -g gulp

just to make sure we have Gulp installed globally (that’s why you need sudo for that). Then run:

npm install gulp gulp-sass gulp-clean gulp-autoprefixer --save-dev

We’ll discuss those packages later.

I have added a “sass” folder inside “twentysixteen-child” that contains all our SASS files.

The style.scss file is our main entry point and, as you can see from the repo, it contains all the boilerplate code required by WordPress to discover the child theme.

I tend to include a _base.scss file that contains all the basic dependencies, like variables and mixins. Then in style.scss I import all the page templates, like _home.scss in our small example.
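
For reference, the top of a child theme’s style.scss looks more or less like this; the header comment is the WordPress boilerplate I mentioned (check the repo for the full version):

```scss
/*
 Theme Name:  Twenty Sixteen Child
 Template:    twentysixteen
*/

// basic dependencies: variables, mixins, etc. (_base.scss)
@import "base";

// one import per page template (_home.scss, ...)
@import "home";
```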

Now let’s talk about the Gulp configuration file. The first lines contain the dependencies we need in our tasks: gulp, sass, clean and autoprefixer (more on this later).

Then we have the paths we want our SASS compiler to run on. As you can see I am using the child theme path as a base concatenated to the others.

After this we can start with the tasks. The first one is responsible for removing all the files from the previous build (basically just one, style.css).

Then we have the actual SASS compilation. I am passing an empty configuration object to the sass() function, but there are several options available; for example, you may want to compress the result.

The “postprocess” task is responsible for every post-compilation step we may want to perform on the output css file. In our case, I am using a very useful library named Autoprefixer that adds all the vendor-specific prefixes. If you’re interested, there’s a nice article on CSS-tricks.com, you can read it here.

The last bit is the “watch” task. This is interesting: basically it tells Gulp to monitor our /sass/ folder and, every time there’s a change, run the “build” task again. That’s it!
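
Putting the pieces together, the gulpfile described above boils down to something like this. It’s a gulp 3-style sketch with illustrative paths, not the real file — that one is in the repo:

```javascript
// Condensed sketch of the gulpfile; paths are illustrative.
var gulp = require('gulp'),
    sass = require('gulp-sass'),
    clean = require('gulp-clean'),
    autoprefixer = require('gulp-autoprefixer');

var basePath = './wp-content/themes/twentysixteen-child/',
    paths = {
        sass: basePath + 'sass/**/*.scss',
        css: basePath
    };

// remove the output of the previous build (basically just style.css)
gulp.task('clean', function () {
    return gulp.src(basePath + 'style.css', { read: false })
        .pipe(clean());
});

// compile SASS; empty options here, but you could pass
// e.g. { outputStyle: 'compressed' }
gulp.task('sass', ['clean'], function () {
    return gulp.src(paths.sass)
        .pipe(sass({}))
        .pipe(gulp.dest(paths.css));
});

// post-compilation step: add the vendor-specific prefixes
gulp.task('postprocess', ['sass'], function () {
    return gulp.src(basePath + 'style.css')
        .pipe(autoprefixer())
        .pipe(gulp.dest(paths.css));
});

gulp.task('build', ['postprocess']);

// re-run "build" whenever a .scss file changes
gulp.task('watch', function () {
    gulp.watch(paths.sass, ['build']);
});

gulp.task('default', ['build', 'watch']);
```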

Now all you have to do, if you’re using Visual Studio Code like me, is hit cmd+shift+p and type “Configure Task Runner”:

then pick Gulp as your default Task Runner. If you take a look at the tasks.json file in the repo, you will notice that I have added some more custom configuration just to instruct VS Code to use the “default” task as main entrypoint.

That’s it!

How I almost lost all my source code.

Now sit down my dear and listen carefully, I’ll tell you a story about how I almost lost all my sources.
A while ago, I decided to give my marvelous Macbook Pro mid-2013 an upgrade. I searched online a little bit and in the end I bought an SSD drive, a Corsair Force LE 240GB.

“But 240 is not enough!” you might say. You’re right, it’s not enough.

I was not using the DVD drive at all, so after a brief search I found the right adapter and replaced it with the old 500GB Apple disk, leaving space for my shiny new SSD.

Everything was perfect, El Capitan was lightning fast, everybody was happy. But then came the day that I needed Windows. So Bootcamp joined us and new partitions started to appear.

180GB OSX Extended and 60GB NTFS on the SSD.
450GB OSX Extended and 50GB exFAT on the ol’ Apple disk.

Again, everything was perfect, El Capitan was still lightning fast, Windows 10 was running fine, everybody was happy.

I was running Windows from the SSD, and all the programs were installed on the other drive, together with all my source code. Yes, before you ask, I have a Bitbucket account. Yeah, a Github one too, but Bitbucket gives you private repos for free.

However, after a while I realized that when Win10 went to sleep, some strange misbehavior appeared, in the form of weird S.M.A.R.T. messages when booting back into MacOs.

Long story short, one day I rebooted from Win to MacOs and poof! The partition with all the sources was gone. Disappeared. An empty, dark and cold space.
I almost had a heart attack.

Disk Utility, Disk Warrior, mysterious command line tools, I tried everything; nothing worked. After hours of research and curses, I fired up Windows and did the only thing I had left:

chkdsk e: /f

That saved my day.

Moral of the story? Always back up your source code, even the most insignificant snippets.

© 2017 Davide Guida
