Django describes itself as "the web framework for perfectionists with deadlines". It was designed to help Python developers take applications from concept to completion as quickly as possible. Learn the advantages and disadvantages of Django.
AWS provides the most comprehensive, secure, and cost-effective portfolio of services for every step of building a data lake and analytics architecture. These services include data migration, cloud infrastructure, management tools, analytics services, visualization tools, and machine learning. In this post we analyze the available solutions.
New data-driven apps, data lake architectures, products, and services create more data that can be stored and managed in the cloud, which allows organizations to develop new capabilities and apps, gain new insights, and deliver new products. The strategy presented here is a stepwise, repeatable process that must be run project by project, like turning a flywheel, building momentum with each turn.
This year EuroPython featured 120 talks across 6 parallel tracks on almost every topic imaginable, including some that we wouldn’t have been able to imagine. In this post, I want to share with you my 10 favorite talks that I believe can make you a better Python developer.
Brace yourselves. Serverless is coming! - You must have heard the term serverless. It is 2019, and everyone in IT is buzzword driven. Many companies are actively participating in, or seriously considering joining, the serverless revolution. Is it worth it? What is in it for you? I would like to focus on the not-so-obvious advantages that the technology brings to the table - in particular, business-wise.
The most challenging part is making such a guideline actionable and as unopinionated as possible. My goal is to identify the critical factors that should restrain you from migrating off AWS just because a newer, shinier option exists. However, before we get to the guide itself, I would like to start with a clarification.
If you have been paying attention to the tech industry lately, you will have noticed a surprising trend: large enterprises are moving away from Amazon Web Services (AWS) to Google Cloud Platform (GCP). Some of the blame can be laid squarely at Amazon’s front door, but other factors can be attributed to the competitive advantage GCP has gained over AWS.
In times when computer programming is becoming more and more accessible due to the growing number of coding schools, online resources and bootcamps, this question seems to go viral: which computer language should I learn first, or which language should I choose for my use case? The situation is no different for Ruby and Python.
This article aims to remind veterans of, and introduce beginners to, the basics of caching (particularly in Ruby). Starting from the basic question – what is caching – we will move on to when given caching techniques should be used, where cached resources can be stored, what types of cache are available, how to approach cache invalidation, and finally, what risks are involved.
Nowadays, any project should be preceded by a detailed customer profiling process that provides accurate information about the customer’s needs and expectations. A decision that carries a real probability of success has to be customer-centered, with as many of those needs and expectations met as possible.
Even a casual look at the brief Wikipedia (translated) definition proves the fallacy of such thinking: “Natural language processing (NLP) – the interdisciplinary scientific field joining artificial intelligence and linguistics, dealing with the automation of the analysis, understanding, translation and generation of natural language by computer.”
Old as it is, the debate continues: what kind of database should be used for my system? The most common answer is usually: “it depends”. We know that it does depend on many different factors. Therefore I would like to cover some of them, which can help you in identifying and selecting the proper one, based on the requirements for your project.
“Our experience has taught us that if your organization hasn’t created and thoroughly tested, repeatedly, a cyber incident response plan across all business areas and personnel, as well as performed simulations of cyber attacks, you won’t do a good job of responding when it occurs for real. We see over and over that it is very difficult to make good decisions when you’re responding to a real attack in the heat of the moment.” /David Burg, Cyber Security & Privacy Leader PwC/
Over the last decades, the vast majority of institutions, companies, firms and the like have had to deal with the reality of Big Data, which forced the urgent need for processing platforms capable of storing and analyzing vast amounts of data. This is why Hadoop and, later on, around 2008, [Spark](/spark-consulting/) came into the picture.
High-volume data streams and a great number of reports for the real estate market were what we were confronted with on one of our client’s projects. More specifically, the client faced a tough scalability problem: the property market reports generated from such a big data set took up to 3 hours to produce (for just 100 markets). Worse, this time kept increasing as each day a few million new records were fed into the data set. To resolve the problem, the client decided to invest in a new system architecture.
The post features an account of a machine-learning-enabled software project in the domain of financial investment optimization and automation in blockchain-based cryptocurrency markets. The article specifies the domain problem addressed, describes the solution development process, and outlines the key project takeaways.
The IoT Working Group of the Eclipse Foundation is working on a few IoT-related software projects, and the effort is backed by players such as Red Hat, Eurotech, Bosch Software Innovations, and GE Digital. The goal is to provide an ecosystem of interoperable IIoT platform components with which to efficiently design and implement
We have recently had an opportunity to design and develop an independent machine learning-based service for a social publishing and e-learning platform. The client needed to build a service that would deliver a recommendation system with automatic content classification. In this post I would like to share some background to the work as well
Red Hat recently announced that the next release of the Eclipse Kapua IoT backend platform will be distributed not only as a jar, but also as a set of Docker images. In particular, Red Hat plans for the dockerized version of Kapua to run smoothly in the OpenShift Origin PaaS. Eclipse Kapua is a modular IoT cloud platform to manage and
A common scenario for mobile IoT gateways, for example, those mounted on trucks or other vehicles, is to cache the data collected locally on the device storage and synchronize it with the data center only when a trusted WiFi access point is available near the gateway. Such a trusted WiFi network could be
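The store-and-forward pattern described above can be sketched as follows. This is a minimal illustration rather than actual gateway code; the class and method names are invented for the example, and the trusted-WiFi check is assumed to come from whatever connectivity detection the device uses:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Minimal store-and-forward buffer: readings collected on the vehicle are
// cached locally and flushed to the data center only when a trusted
// network is available.
public class StoreAndForwardBuffer {

    private final Deque<String> cache = new ArrayDeque<>();

    // Called for every reading collected on the gateway.
    public void collect(String reading) {
        cache.addLast(reading);
    }

    // Called periodically; drains the cache only when connectivity is
    // trusted. Returns the batch to synchronize (empty if offline).
    public List<String> syncIfConnected(boolean trustedWifiAvailable) {
        List<String> batch = new ArrayList<>();
        if (trustedWifiAvailable) {
            while (!cache.isEmpty()) {
                batch.add(cache.removeFirst());
            }
        }
        return batch;
    }

    // Number of readings still waiting for a trusted connection.
    public int pending() {
        return cache.size();
    }
}
```

A production gateway would persist the cache to device storage (e.g. an embedded database) so that buffered readings survive a reboot, but the control flow stays the same.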
Which GPS unit should you choose? There is a myriad of GPS receivers available on the market. The BU353 is one of the most popular as well as one of the least expensive GPS units. It can be connected to a computer via a USB port. If you are looking for a fairly good
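Units like the BU353 present themselves as a serial port and stream standard NMEA 0183 sentences. As a hedged sketch, the snippet below decodes the position from a `$GPGGA` sentence; reading the serial port itself (e.g. with a serial I/O library) is omitted, and the class name is invented for the example:

```java
// Decodes latitude/longitude from a standard NMEA 0183 $GPGGA sentence,
// as emitted by serial GPS receivers such as the BU353.
public class NmeaParser {

    // Converts NMEA "ddmm.mmmm" (or "dddmm.mmmm" for longitude) plus a
    // hemisphere letter into signed decimal degrees.
    static double toDecimalDegrees(String value, String hemisphere) {
        int degDigits = value.indexOf('.') - 2; // minutes use 2 digits before the dot
        double degrees = Double.parseDouble(value.substring(0, degDigits));
        double minutes = Double.parseDouble(value.substring(degDigits));
        double result = degrees + minutes / 60.0;
        return (hemisphere.equals("S") || hemisphere.equals("W")) ? -result : result;
    }

    // Extracts {latitude, longitude} from a $GPGGA sentence.
    // GPGGA fields: time, lat, N/S, lon, E/W, fix quality, satellites, ...
    public static double[] parseGga(String sentence) {
        String[] f = sentence.split(",");
        return new double[] {
            toDecimalDegrees(f[2], f[3]), // latitude + N/S
            toDecimalDegrees(f[4], f[5])  // longitude + E/W
        };
    }
}
```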
The key part of the process of tailoring a perfect IoT solution is choosing the proper hardware for the gateway device. In general, the more expensive the gateway hardware is, the more messages per second you can process. On the other hand, the more expensive the gateway device is, the less affordable your IoT solution
Adding new components at runtime Apache Camel relies on components to provide connectors which can be used to consume messages from various endpoints as well as send messages to them. For example, a gateway based on Apache Camel could use the Paho MQTT component to consume control commands from the
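Registering a component at runtime can be sketched as below, assuming Camel 2.x with `camel-core` and `camel-paho` on the classpath; this is an illustration, not a complete gateway:

```java
import org.apache.camel.CamelContext;
import org.apache.camel.component.paho.PahoComponent;
import org.apache.camel.impl.DefaultCamelContext;

public class RuntimeComponentExample {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        // Register the Paho component under the "paho" URI scheme at runtime,
        // so that "paho:..." endpoints resolve to it.
        context.addComponent("paho", new PahoComponent());
        context.start();
        // ... routes referring to "paho:..." endpoints can now be added ...
        context.stop();
    }
}
```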
Apache Camel 2.16 brings the Paho component, which provides a connector for the MQTT messaging protocol using the Eclipse Paho library. Paho is one of the most popular MQTT libraries, so if you would like to integrate it with your Java project, the Camel Paho connector is the way to go. How can I use
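A minimal route using the component might look like the following sketch; it assumes `camel-core` and `camel-paho` on the classpath, and the topic name and broker URL are placeholders:

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class PahoRouteExample {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.addRouteBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                // Consume MQTT messages from the "sensors" topic on a
                // local broker and log each message body.
                from("paho:sensors?brokerUrl=tcp://localhost:1883")
                    .to("log:received");
            }
        });
        main.run();
    }
}
```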