As part of the AI 1.0 surge (1983–1987), I felt that AI, to be of practical use, had to be distributed. Since then, I have been building distributed operating systems.
I admit I stopped when I encountered Kubernetes in 2016 because, from a software architect's and engineer's viewpoint, there was little I could add, and what I could add would be minor.
Yes, I think Kubernetes is a great distributed operating system architecture for anything from a cloud of virtual machines (VMs) to a hive of Raspberry Pis.
I returned to Machine Learning about nine years ago during the AI 2.0 …
I was taught to simplify a complex problem by dividing it into smaller sub-problems.
I made the process paradigm of Machine Learning Operations (MLOps) simpler by dividing it into five different, but overlapping process groups or Operations (Ops) groups.
MLOps is automating the Machine Learning product life cycle.
In 1990, more than 80% of IT projects were never rolled out. The failure rate has since dropped because of standardized developer tools, repeatable process iteration, the decline of the Waterfall method, the rise of the Agile method, and unit testing, to name some of the code-development advances.
I plan to increase stability and performance and to decrease the cost of maintenance of the Photonai code base. I am adding clustering functionality to the original code base and changing the architecture.
A small code refactoring task is usually a bug fix. Some argue it is not refactoring if the bug fixing occurs before release to test.
An example of a significant code refactoring project is making a program Y2K-compliant without changing its functionality. Y2K compliance means the code operates correctly with dates on or after January 1, 2000. (Yes, this was a thing!)
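As an illustration, here is a minimal sketch of the "windowing" technique, one common Y2K remediation that maps two-digit years to four-digit years without widening the stored field. The function name and the pivot of 50 are assumptions for illustration, not from the original article:

```python
def expand_two_digit_year(yy: int, pivot: int = 50) -> int:
    """Map a two-digit year to a four-digit year using a pivot window.

    Two-digit years below the pivot are treated as 20xx; the rest as
    19xx. This was a common Y2K fix when stored date fields could not
    be widened to four digits.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy < pivot else 1900 + yy
```

The behavior-preserving part is the point: callers still hand the function two-digit years, but the program now interprets them correctly past January 1, 2000.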
Python is a dynamically typed language. However, starting with Python 3.5 (PEP 484), type hints were introduced. Type hints (note: not strong type checking) make it possible to statically type-check Python code after it is written.
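To make that concrete, here is a minimal sketch (the function and data are hypothetical) of type-hinted Python that a static checker such as mypy can verify before the code ever runs:

```python
from typing import Optional

def find_user(users: dict[str, int], name: str) -> Optional[int]:
    """Return the user's id, or None if the name is unknown."""
    return users.get(name)

# Hints do not change runtime behavior; they enable static checking.
# A checker such as mypy would flag a misuse like:
#   find_user({"ada": 1}, 42)   # error: "int" is not "str"
ada_id = find_user({"ada": 1, "grace": 2}, "ada")
```

Note that Python itself ignores the hints at runtime; the value is in tooling that catches type errors "post coding," exactly as the paragraph describes.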
Here’s a great figure showing the evolution of Python type hinting:
This article shows, step by step, how to create a Docker image containing the Python package Diagrams, which renders high-quality architecture diagrams of Azure, AWS, and GCP, and the Graphviz runtime that Diagrams depends on. All code is included and can be downloaded.
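A minimal sketch of such an image (the base tag, script name, and install commands are assumptions for illustration, not the article's exact Dockerfile):

```dockerfile
# Assumed base image; the key point is installing the Graphviz
# runtime that the Diagrams package shells out to.
FROM python:3.9-slim
RUN apt-get update \
    && apt-get install -y --no-install-recommends graphviz \
    && rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir diagrams
WORKDIR /app
COPY render_diagram.py .
CMD ["python", "render_diagram.py"]
```

Installing Graphviz at the OS level is the step people usually miss: `pip install diagrams` alone does not provide the `dot` binary Diagrams needs.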
I have posted several articles on how to create development and test Docker images [see references 4, 5, and 6 below]. I assume you are familiar with Docker and have read them.
Docker is used to encapsulate your application in an individual image.
Docker-Compose is used to manage several images at the same time for…
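A minimal docker-compose sketch (service names, ports, and images are hypothetical) of how one file manages several containers together:

```yaml
# Hypothetical two-service setup: an application image built from
# the local Dockerfile, plus a database it depends on.
version: "3.8"
services:
  app:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: example
```

With this file, `docker-compose up` starts both containers in dependency order, where plain Docker would require two separate `docker run` commands and manual networking.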
You don’t have relationship problems.
When you talk to a female friend and
Note: I realize that even if you are old, you may still try to offer advice.
When talking to a male friend and
You leave your binoculars at home when you go to the beach.
You are in Las Vegas and go to bed
Estimates suggest that 70%–85% of the world's data is text (unstructured data). As a result, new deep learning language models (transformers) have caused explosive growth in industry applications [5, 6, 11].
This blog is not an introduction to Natural Language Processing. Instead, it assumes you are familiar with noise reduction and normalization of text. It covers text preprocessing up to producing tokens and lemmas from the text.
We stop before feeding the sequence of tokens into a Natural Language model.
The feeding of that sequence of tokens into a Natural Language model to accomplish a specific model task is…
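To show the stopping point concretely, here is a toy, standard-library-only sketch of producing tokens and lemmas. The tiny lemma table is illustrative only; a real pipeline would use a library lemmatizer such as spaCy's or NLTK's:

```python
import re

def tokenize(text: str) -> list[str]:
    """Lowercase the text and split it into alphabetic word tokens."""
    return re.findall(r"[a-z]+", text.lower())

# Toy lemma lookup for illustration; real lemmatizers use vocabulary
# and morphology, not a hand-written table.
LEMMAS = {"ran": "run", "running": "run", "models": "model", "better": "good"}

def lemmatize(tokens: list[str]) -> list[str]:
    """Map each token to its lemma, leaving unknown tokens unchanged."""
    return [LEMMAS.get(tok, tok) for tok in tokens]

tokens = tokenize("The models were running, better than before!")
lemmas = lemmatize(tokens)
```

Everything after this point, feeding `lemmas` (or the tokens) into a language model for a specific task, is where the article stops.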
With these thirty concepts and sixty gap-filling tips, you are well prepared for the edX AWS Developer course. Passing and earning a certificate is straightforward.
I go through the AWS (Amazon Web Services) Developer Certification edX course. I define the terms and concepts of the course. Also, I add explanations and tips that may help you.
I use it as my filter: “I wish I had known that” or “I wish that had been emphasized.” I do not expand on material that I think is adequately covered.
I recommend the following steps:
GPT-3 is a good start toward human-like natural language performance. Perhaps a better analogy might be the "Homo habilis" of Artificial General Intelligence Natural Language.
The Homo habilis species distinguished itself from the earlier Australopithecus group with a slightly larger braincase and a smaller face and teeth. However, Homo habilis retained some ape-like features.
Most importantly for the analogy I will make, Homo habilis is thought to be the first maker of stone tools.
In my analogy, GPT-3 represents the Homo habilis of the Artificial Intelligence of Natural Language. The following is my reasoning:
As we continue to develop machine learning Operations (MLOps), we need to think of machine learning (ML) development and deployment flow as a Directed Acyclic Graph (DAG).
DAG is a scary acronym, but so are LSTM, DNN, backpropagation, GAN, transformer, and many others.
I think using “pipeline” is wrong.
The problem with "pipeline" is that it is slang. Worse, it connotes a linear process of steps.
I can assure you the human brain is not a “pipeline.”
Note: I can’t think of any mammals as having a “pipeline.” Do some insects and dinosaurs have a pipeline of ganglia?
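The point about branching and rejoining can be sketched with Python's standard-library graphlib. The workflow step names here are hypothetical ML stages, not from any particular MLOps tool:

```python
from graphlib import TopologicalSorter

# Each key maps a step to the set of steps that must run before it.
# "evaluate" depends on both "train" and "validate", so the flow
# branches and rejoins -- a DAG, not a linear pipeline.
ml_dag = {
    "ingest": set(),
    "validate": {"ingest"},
    "featurize": {"validate"},
    "train": {"featurize"},
    "evaluate": {"train", "validate"},
    "deploy": {"evaluate"},
}

# Any valid execution order respects every dependency edge.
order = list(TopologicalSorter(ml_dag).static_order())
```

A pipeline can only express the special case where each step has exactly one predecessor; the DAG above cannot be flattened into that shape without losing the `evaluate` step's second dependency.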