Art of Making

I caught the making virus from our engineers and, jointly with a fellow parent, volunteered to teach an “Art of Making” seminar at my daughter’s school (my friend’s son came up with the title). Our hope was to bring STEM, art, and design thinking together. (I am supposed to represent art and design thinking, but secretly dream of picking up some STEM along the way.)

There are about thirty different seminars that (more...)

Dump Oracle data into a delimited ascii file with PL/SQL

This is how I dump data from an Oracle Database (tested on 8i, 9i, 10g, 11g and 12c) to a delimited ASCII file:

SQL*Plus: Release 12.1.0.2.0 Production on Fri Feb 24 13:55:47 2017
Copyright (c) 1982, 2014, Oracle.  All rights reserved.

Connected to:
Oracle Database 12c Standard Edition Release 12.1.0.2.0 - 64bit Production

SQL> set timing on
SQL> select Dump_Delimited('select * from all_objects', 'all_objects.csv') nr_rows from dual;

   NR_ROWS
----------
     97116

Elapsed:  (more...)

DIY Parallelization with Oracle DBMS_DATAPUMP

Oracle dbms_datapump provides a parallel option for exports and imports, but some objects cannot be processed in this mode. In a migration project from AIX 11gR2 to an ODA X5-2 (OL 5.9) running 12c, which included an initial load for GoldenGate, I had to deal with one of those objects: a 600 GB table with LOB fields, stored in the database as BasicFiles (= traditional LOB storage).

By applying some DIY (more...)

AMIS Tools Showroom – The Sequel – Thursday 16 March 2017

Thursday 16 March

17:00–21:00

AMIS, Nieuwegein

Register via: bu.om@amis.nl

On Thursday 16 March the second AMIS Tools Showroom session takes place. The first session was on 13 December: in short and very short presentations and demonstrations, 16 AMIS staff showed all kinds of handy tools and utilities. The emphasis in that session was on tools for monitoring, communication and collaboration.

In this second session we go in search of even (more...)

Duolingo: A UX Storyteller’s Dream

I really love the language-learning app Duolingo. I’ve been a big fan since its early days and have a few new efforts underway with it! I now use it for Irish, Dutch, Spanish and Italian.

Exploring Spanish on Duolingo. I think I will wait until I get a bit more fluent before sharing on LinkedIn. Social is a key part of the experience.

I’m especially thrilled that my own language, Irish (Gaeilge (more...)

Node.js application writing to MongoDB – Kafka Streams findings read from Kafka Topic written to MongoDB from Node

MongoDB is a popular, lightweight, highly scalable, very fast and easy-to-use NoSQL document database. Written in C++, working with JSON documents (stored in the binary format BSON) and processing JavaScript commands using the V8 engine, MongoDB ties in easily with many different languages and platforms, one of which is Node.JS. In this article, I first describe how a very simple interaction between Node.JS and MongoDB can be implemented.
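Such a very simple interaction can be sketched roughly as follows. This is a sketch, not the article’s actual code: it assumes the official `mongodb` npm driver and a mongod listening on the default local port, and the database, collection and field names are made up for illustration.

```javascript
// Sketch: write a JSON document to MongoDB from Node.js and read it back.
// Assumes the official `mongodb` npm driver and a mongod on localhost:27017.

// Pure helper: shape a record into the document we want to store.
function toCountryDoc(name, continent, sizeInKm2) {
  return { name, continent, sizeInKm2 };
}

async function saveAndReadBack(doc) {
  // Require the driver lazily, so the pure helper above works without it.
  const { MongoClient } = require('mongodb');
  const client = await MongoClient.connect('mongodb://localhost:27017');
  try {
    const coll = client.db('world').collection('countries');
    await coll.insertOne(doc);                     // stored in binary BSON
    return await coll.findOne({ name: doc.name }); // read back as JSON
  } finally {
    await client.close();
  }
}

// Only touch the database when explicitly asked to.
if (process.env.RUN_MONGO_DEMO) {
  saveAndReadBack(toCountryDoc('Ireland', 'Europe', 70273))
    .then(console.log)
    .catch(console.error);
}
```

Note that the exact shape of `MongoClient.connect` has shifted between driver versions; treat the call above as indicative rather than definitive.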


Then (more...)

Public Cloud consequences with an Oracle environment

The title suggests a negative statement about using a Public Cloud. Well, it isn’t. I’m convinced the Cloud is the next big thing, with huge advantages for businesses. But companies should be aware of what they choose. A lot of providers, including Oracle, are pushing us to the cloud, whether Public, Private or Hybrid, and want us to believe that a public cloud is an inevitable extension of our on-premises environment. Moving your WebLogic and (more...)

Node.js application using SSE (Server Sent Events) to push updates (read from Kafka Topic) to simple HTML client application

This article describes a simple Node.js application that uses Server-Sent Events (SSE) to push updates to a simple HTML client, served through the Express framework. The updates originate from messages consumed from a Kafka Topic. Although the approach outlined in this article stands on its own, and does not even depend on Apache Kafka, it also forms the next step in a series of articles that describe a Kafka Streams application that (more...)

Nerds Get Sexy at the Dublin Tech Summit

I attended the first Dublin Tech Summit (@DTS) and have to say I was very impressed by the depth and breadth of the event. I even learned new things, renewed old acquaintances and forged new business relationships.

DTS offered a powerful schedule, covering everything from Fashion and Healthcare to IoT and FinTech, for an audience ranging from storytellers to investors to coders, and was combined with workshops and a platform for startups as well (more...)

Install and configure Oracle HTTP Server Standalone

Last week I had an assignment to install and configure Oracle HTTP Server as a reverse proxy in a DMZ. Many years ago I worked with Apache a little, so I no longer had the details at hand.

Although installing and configuring the HTTP Server is not hard, I found that I had to do quite a bit of searching to get to all the essentials. To help you out, and to have it logged for (more...)

RIP Oracle Connect

Blogging and long-form content seem so tedious to me nowadays, but if you’ve read here for a while, you’ll recall that I used to post several times a week.

One of the reasons I’ve kept this blog running in the era of ever-shorter content is that it keeps a historical record of this team’s work and our thoughts. As an emerging technologies team, we know that not all our thinking will make it to daylight, so we (more...)

Kafka Streams and NodeJS – Consuming and periodically reporting in Node.JS on the results from a Kafka Streams streaming analytics application

In several previous articles on Apache Kafka, Kafka Streams and Node.JS, I have described how to create a Node.JS application that publishes messages to a Kafka Topic (based on entries in a CSV file), how to create a simple Kafka Streams Java application that processes such messages from that Topic, and how to extend that Java application to produce a running Top-N aggregation from that Topic. In this (more...)

Connecting Oracle Management Cloud with Oracle Enterprise Manager 13c

Let’s be clear about this: Oracle Management Cloud (OMC) is NOT a replacement for Oracle Enterprise Manager Cloud Control (OEM CC), nor even an equivalent. Rumour has it that this will be Oracle’s policy in a far-away future, but in the meantime we focus on what each does best. OEM CC is a product for a complete management solution for your Oracle environment; OMC is for monitoring and, most of all, analysing the monitored data in your (more...)

Apache Kafka Streams – Running Top-N Aggregation grouped by Dimension – from and to Kafka Topic

This article explains how to implement a streaming analytics application using Kafka Streams that performs a running Top-N analysis on a Kafka Topic and produces the results to another Kafka Topic.
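Stripped of all Kafka plumbing, the running Top-N logic can be sketched as a pure update function. This is an illustrative sketch in JavaScript, not the article’s Java code; in a Kafka Streams application, state like this would live inside the aggregator passed to a grouped stream.

```javascript
// Sketch: maintain a running Top-N list of { key, value } entries.
function updateTopN(top, entry, n) {
  // Drop any previous observation for this key, then re-rank.
  const next = top.filter(e => e.key !== entry.key);
  next.push(entry);
  next.sort((a, b) => b.value - a.value); // highest value first
  return next.slice(0, n);                // keep only the top N
}

// Keep a separate Top-N per dimension (e.g. per continent), keyed in a Map.
function updatePerDimension(state, dimension, entry, n) {
  const current = state.get(dimension) || [];
  state.set(dimension, updateTopN(current, entry, n));
  return state;
}
```

Each incoming message updates only the list for its own dimension, which is exactly the grouped-by-dimension behaviour the streaming application needs.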

Two previous articles are relevant as reference:

Getting Started with Kafka Streams – building a streaming analytics Java application against a Kafka Topic

Kafka Streams is a lightweight Java library for creating advanced streaming applications on top of Apache Kafka Topics. Kafka Streams provides easy-to-use constructs that allow quick, almost declarative composition by Java developers of streaming pipelines that perform running aggregates, real-time filtering, time windows and joining of streams. Results from the streaming analysis can easily be published to a Kafka Topic or to external destinations. Despite the close integration with Kafka and the (more...)

Workshop Apache Kafka – presentation and hands on labs for getting started

The AMIS SIG session on Apache Kafka (9 February 2017) took 25 participants on a tour of Apache Kafka. Through presentations, demonstrations and a hands-on workshop, we provided a hit-the-ground-running introduction to Apache Kafka, with Kafka Streams as a bonus. The workshop was put together by Maarten Smeets and Lucas Jellema.

All materials for the workshop are freely available. The sources and hands-on lab are available in a GitHub Repo: https://github.com/MaartenSmeets/kafka-workshop 

The (more...)

Download all directly and indirectly required JAR files using Maven install dependency:copy-dependencies

My challenge is simple: I am creating a small Java application – a single class with a main method – that has many direct and indirect dependencies. In order to run my simple class locally I need to:

  • code the Java Class
  • compile the class
  • run the class

In order to compile the class, all directly referenced classes from supporting libraries should be available. To run the class, all indirectly invoked classes should also be available. That (more...)
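One way to wire this up is with the standard maven-dependency-plugin; its `copy-dependencies` goal gathers direct and transitive JARs alike. The fragment below is a sketch: the plugin and goal are standard Maven, but the `lib` output directory is simply a choice.

```xml
<!-- Fragment of pom.xml: copy all direct and transitive dependency JARs -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.directory}/lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After `mvn install dependency:copy-dependencies`, every required JAR sits in `target/lib`, so the class can be run with a classpath wildcard such as `java -cp "target/classes:target/lib/*" ...`.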

NodeJS – Publish messages to Apache Kafka Topic with random delays to generate sample events based on records in CSV file

In a recent article I described how to implement a simple Node.JS program that reads and processes records from a delimiter-separated file. That is a stepping stone on the way to my real goal: publishing a load of messages on a Kafka Topic, based on records in a file, and semi-randomly spread over time.

In this article I will use the stepping stone and extend it:

  • read all records from CSV file into a (more...)
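The semi-random spreading over time can be sketched independently of Kafka. In this illustrative sketch the `publish` callback is a stub; in the article it would hand the record to a Kafka client library.

```javascript
// Sketch: publish each record after a random delay, spreading them over time.
function randomDelayMs(maxMs) {
  return Math.floor(Math.random() * maxMs); // 0 <= delay < maxMs
}

function scheduleRecords(records, maxDelayMs, publish) {
  for (const record of records) {
    // Each record gets its own timer, so publish order is semi-random too.
    setTimeout(() => publish(record), randomDelayMs(maxDelayMs));
  }
}
```

Calling `scheduleRecords(rows, 60000, produceToKafka)` would scatter all rows over roughly one minute, which is enough to make the stream look like live events rather than a bulk load.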

NodeJS – reading and processing a delimiter separated file (csv)

Frequently, there is a need to read data from a file, process it and route it onwards. In my case, the objective was to produce messages on a Kafka Topic. However, regardless of the objective, the basic steps of reading the file and processing its contents are often required. In this article I show the very basic steps with Node.js and the Node module csv-parse.

1. npm init process-csv

Enter a small number (more...)

Oracle Service Bus: Produce messages to a Kafka topic

Oracle Service Bus is a powerful tool that provides features like transformation, throttling and virtualization for messages coming from different sources. There is a (recently open-sourced!) Kafka transport available for Oracle Service Bus (see here). Oracle Service Bus can thus be used to do all kinds of interesting things to messages coming from Kafka topics. You can then produce the altered messages to other Kafka topics and create a decoupled processing chain. In this (more...)