When you think about it, most of the tasks you carry out in a DevOps environment revolve around automation.
Infrastructure as Code? It’s automation, right? Cool.
Testing? As automated as possible. Integration with third-party systems (SonarQube, WhiteSource, Azure DevOps, Jira, anything you retrieve data from or send data to)? Automated as well.
You might think I live in a DevOps bubble. Let’s expand then – SRE? Data processing? How many of the things we rely on in modern development have automation at their core? Excellent, you got your answer.
Mind you, this is not a cloud exclusive. At the end of the day it’s technology we are talking about – hence my mantra: “A technology problem is never really a problem. Technology can be bent at will.”
Now, I am working with a client at the moment who has many efforts going on, and I was asked if I knew of a way to automate the deployment of Azure Machine Learning Experiments from Azure ML Studio.
Being completely new to the technology I spent some time on it, and despite being a machine learning tool it is quite WYSIWYG. Hence, no automation support whatsoever at the moment. Being a cloud-based product there isn’t a shadow of a doubt that this will eventually be implemented (given enough demand, obviously), but it is manual today. And I need to do this today, so there was no option but to go down a custom route.
Let’s pretend for a minute that I am living under a rock and don’t know how to use a search engine. The first step here would be to fire up Fiddler and see what happens when you press the buttons in the tool’s UI.
It is a modern web-based tool so there is communication
between the UI and some API layer. There will be some sort of authentication
involved. You can work your way around it. Eventually, you will be able to
replicate this interaction with a script, and put it in your pipeline.
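As a minimal sketch of what that replication might look like, here is how you could recreate one of the captured requests in Python. The host, path, workspace ID, and token format below are all placeholders – the real values are whatever you observe in the Fiddler trace, not anything documented:

```python
# Hypothetical sketch: replaying a captured API call outside the browser.
# The endpoint and auth scheme are placeholders taken "from the trace".
import urllib.request

API_BASE = "https://studioapi.azureml.example/api"  # placeholder host


def build_request(workspace_id: str, token: str) -> urllib.request.Request:
    """Recreate the request the UI sends, reusing the captured auth header."""
    url = f"{API_BASE}/workspaces/{workspace_id}/experiments"
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",  # copied from the trace
            "Accept": "application/json",
        },
    )


req = build_request("my-workspace", "captured-token")
print(req.full_url)
```

From there, `urllib.request.urlopen(req)` (or any HTTP client) fires the call, and the same function drops straight into a pipeline script.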
Given that I do not live under a rock and do know how to use a browser, the first search result is the excellent Azure ML PS. Using it in a set of PowerShell scripts, stored in a Git repository and consumed by an Azure PowerShell task in Azure DevOps Pipelines, is really trivial. Again, this is a valid proposition for both on-premises and cloud-based systems.
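As a rough sketch, the pipeline step could look like the YAML below. The script name, arguments, and service connection are illustrative, and the task version and input names may differ in your setup:

```yaml
# Illustrative Azure DevOps pipeline step running a PowerShell script
# (e.g. one built on the Azure ML PS module) from the repository.
steps:
- task: AzurePowerShell@4
  inputs:
    azureSubscription: 'my-service-connection'      # placeholder
    ScriptType: 'FilePath'
    ScriptPath: 'scripts/Deploy-Experiment.ps1'     # hypothetical script
    ScriptArguments: '-WorkspaceId $(WorkspaceId)'  # hypothetical parameter
    azurePowerShellVersion: 'LatestVersion'
```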
Pipelines are brilliant for this – and I mean it. If you have something ready, or something that cannot use a task, just wrap it in a PowerShell or Bash script and you are in business. Use a Windows agent or a Unix agent – to be fair, I don’t really care. As long as you can interact with the target system with a script, everything is good in my view.
Sure, there will be situations where automating gets really difficult. Sometimes you can’t avoid doing things manually. Other times you will be better off rewriting the whole thing. But usually we are on a quest for continuous automation, and that is what keeps us going. Keep automating!