I caught the making virus from our engineers and, jointly with a fellow parent, volunteered to teach an “Art of Making” seminar at my daughter’s school (my friend’s son came up with the title). Our hope was to bring STEM, art, and design thinking together. (I am supposed to represent art and design thinking, but secretly dream of learning some STEM along the way.)
There are about thirty different seminars that (more...)
This is how I dump data from an Oracle Database (tested on 8i, 9i, 10g, 11g and 12c) to a delimited ASCII file:
SQL*Plus: Release 12.1.0.2.0 Production on Fri Feb 24 13:55:47 2017
Copyright (c) 1982, 2014, Oracle. All rights reserved.
Oracle Database 12c Standard Edition Release 12.1.0.2.0 - 64bit Production
SQL> set timing on
SQL> select Dump_Delimited('select * from all_objects', 'all_objects.csv') nr_rows from dual;
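The post's `Dump_Delimited` is a PL/SQL function whose body is not shown in this excerpt. As a hedged illustration only, here is the core field-quoting logic any delimited export needs, sketched in Node.js (all names below are hypothetical, not the post's code):

```javascript
// Quote a single field if it contains the delimiter, a quote or a newline,
// doubling embedded quotes as delimited formats conventionally require.
function quoteField(value, delimiter) {
  const s = String(value ?? '');
  if (s.includes(delimiter) || s.includes('"') || s.includes('\n')) {
    return '"' + s.replace(/"/g, '""') + '"';
  }
  return s;
}

// Turn one row (an array of column values) into a delimited line.
function toDelimitedLine(row, delimiter = ',') {
  return row.map((v) => quoteField(v, delimiter)).join(delimiter);
}
```

Writing one such line per fetched row is the essence of what a dump-to-CSV routine does, whatever language it is implemented in.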
Oracle dbms_datapump provides a parallel option for exports and imports, but some objects cannot be processed in this mode. In a migration project from 11gR2 on AIX to 12c on an ODA X5-2 (OL 5.9), which included an initial load for GoldenGate, I had to deal with one of those objects: a 600 GB table with LOB fields, stored in the database as BasicFiles (the traditional LOB storage).
On Thursday 16 March the second AMIS Tools Showroom Session takes place. The first session was on 13 December: there, 16 AMIS colleagues showed all kinds of handy tools and utilities in short and very short presentations and demonstrations. The emphasis in that session was on tools for monitoring, communication and collaboration.
In this second session we go in search of even more (more...)
The title suggests a negative view of using a Public Cloud. Well, it isn’t meant that way. I’m convinced the Cloud is the next big thing, with huge advantages for businesses. But companies should be aware of what they choose. A lot of providers, including Oracle, are pushing us to the cloud, whether Public, Private or Hybrid, and make us believe that a public cloud is an inevitable extension of our on-premises environment. Moving your WebLogic and (more...)
This article describes a simple Node.js application that uses Server-Sent Events (SSE) technology to push updates to a simple HTML client, served through the Express framework. The updates originate from messages consumed from a Kafka Topic. Although the approach outlined in this article stands on its own, and does not even depend on Apache Kafka, it also forms the next step in a series of articles that describe a Kafka Streams application that (more...)
I attended the first Dublin Tech Summit (@DTS) and have to say I was very impressed by the depth and breadth of the event. I even learned new things, renewed old acquaintances and forged new business relationships. DTS offered a powerful schedule, covering topics from Fashion and Healthcare to IoT and FinTech and more, for an audience ranging from storytellers to investors to coders, and was combined with workshops and a platform for startups as well (more...)
Blogging and long-form content seem so tedious to me nowadays, but if you’ve read here for a while, you’ll recall that I used to post several times a week.
One of the reasons I’ve kept this blog running in the era of ever-shorter content is that it keeps a historical record of this team’s work and our thoughts. As an emerging technologies team, we know that not all our thinking will make it to daylight, so we (more...)
Let’s be clear about this: Oracle Management Cloud (OMC) is NOT a replacement for Oracle Enterprise Manager Cloud Control (OEM CC), or even an equivalent. Rumour has it that this will be Oracle’s policy in the distant future, but in the meantime we focus on what each does best. OEM CC is a product for a complete management solution for your Oracle environment; OMC is for monitoring and, most of all, analysing the monitored data in your (more...)
This article explains how to implement a streaming analytics application, using Kafka Streams, that performs a running Top N analysis on a Kafka Topic and produces the results to another Kafka Topic. Visualized, it looks like this:
Kafka Streams is a lightweight Java library for creating advanced streaming applications on top of Apache Kafka Topics. It provides easy-to-use constructs that allow Java developers to quickly and almost declaratively compose streaming pipelines that perform running aggregates, real-time filtering, time-windowed operations and stream joins. Results from the streaming analysis can easily be published to a Kafka Topic or to external destinations. Despite the close integration with Kafka and the (more...)
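The aggregation step at the heart of such a pipeline is easy to show in isolation. The article's implementation uses the Java Kafka Streams DSL; the sketch below strips Kafka away entirely and only illustrates the running Top-N logic, with made-up field names:

```javascript
// Fold one update into a Top-N list: replace any existing entry with the
// same key, re-rank by score, and keep only the N best.
function updateTopN(topN, entry, n = 3) {
  const next = topN.filter((e) => e.key !== entry.key);
  next.push(entry);
  next.sort((a, b) => b.score - a.score);
  return next.slice(0, n);
}

// Reduce a stream of updates into a running Top-N result.
function runningTopN(updates, n = 3) {
  return updates.reduce((acc, u) => updateTopN(acc, u, n), []);
}
```

In Kafka Streams this fold is what the stateful aggregate operation performs per key, with the evolving Top-N list kept in a state store and each change emitted downstream.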
The AMIS SIG session on Apache Kafka (9th February 2017) took 25 participants by the hand on a tour of Apache Kafka. Through presentations, demonstrations and a hands-on workshop, we provided a hit-the-ground-running introduction to Apache Kafka, with Kafka Streams as a bonus. The workshop was run by Maarten Smeets and Lucas Jellema.
In a recent article I described how to implement a simple Node.js program that reads and processes records from a delimiter-separated file. That is a stepping stone on the way to my real goal: publishing a load of messages on a Kafka Topic, based on records in a file, semi-randomly spread over time.
In this article I will use the stepping stone and extend it:
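The "semi-randomly spread over time" part can be sketched independently of Kafka: compute a jittered, cumulative delay per record and schedule each publish with `setTimeout`. The `publish` callback and timing constants below are placeholders, not the article's code:

```javascript
// Compute cumulative delays, each increment uniform in [0.5*avgMs, 1.5*avgMs),
// so records keep their order but arrive at irregular intervals.
function jitteredDelays(count, avgMs) {
  const delays = [];
  let t = 0;
  for (let i = 0; i < count; i++) {
    t += avgMs * (0.5 + Math.random());
    delays.push(Math.round(t));
  }
  return delays;
}

// Schedule one publish per record at its jittered offset.
function scheduleRecords(records, avgMs, publish) {
  const delays = jitteredDelays(records.length, avgMs);
  records.forEach((rec, i) => setTimeout(() => publish(rec), delays[i]));
}

// Example (uncomment to run):
// scheduleRecords(['r1', 'r2', 'r3'], 1000, (r) => console.log('publish', r));
```

In the article's context, `publish` would hand the record to a Kafka producer instead of logging it.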
Frequently, there is a need to read data from a file, process it and route it onwards. In my case, the objective was to produce messages on a Kafka Topic. However, regardless of the objective, the basic steps of reading the file and processing its contents are often required. In this article I show these basic steps using Node.js and the Node module csv-parse.
Oracle Service Bus is a powerful tool providing features such as transformation, throttling and virtualization of messages coming from different sources. There is a (recently open-sourced!) Kafka transport available for Oracle Service Bus (see here). Oracle Service Bus can thus be used to do all kinds of interesting things to messages coming from Kafka topics. You can then produce altered messages to other Kafka topics and create a decoupled processing chain. In this (more...)