The Rittman Mead Global Services team have recently been involved in a number of security architecture implementations and have produced a security model that meets a diverse set of requirements. Using our experience and standards we have been able to deliver a robust model that addresses the common questions we routinely receive around security, such as:
“What considerations do I need to make when exposing Oracle BI to the outside world?”
“How can I make a flexible security (more...)
The other week I posted a three-part series (part 1, part 2 and part 3) on going beyond MapReduce for Hadoop-based ETL, where I looked at a typical Apache Pig dataflow-style ETL process and showed how Apache Tez and Apache Spark can potentially make these processes run faster and make better use of in-memory processing. I picked Pig as a data processing environment because its multi-step data transformations translate into lots (more...)
I’m very pleased to announce that the Call for Papers for the Rittman Mead BI Forum 2015 is now open, with abstract submissions open until January 18th 2015. As in previous years the BI Forum will run over consecutive weeks in Brighton, UK and Atlanta, GA, with the provisional dates and venues as below:
- Brighton, UK: Hotel Seattle, Brighton, UK: May 6th – 8th 2015
- Atlanta, GA: Renaissance Atlanta Midtown Hotel, Atlanta, USA (more...)
At Rittman Mead R&D, we have the privilege of solving some of our clients’ most challenging data problems. We recently built a set of customized data products that leverage the power of Oracle and Cloudera platforms and wanted to share some of the fun we’ve had in creating unique user experiences. We’ve been thinking about how we can lean on our efforts to help make the holidays even more special for the extended Rittman Mead (more...)
If you are downloading the EA1 of SQL Developer that includes Oracle Data Miner (ODMr), and you intend to use Oracle Data Miner, then you will need to update the ODMr Repository.
You could do it the hard way and run the repository upgrade SQL scripts that are located in the ...\sqldeveloper-<version>-no-jre\sqldeveloper\dataminer\scripts directory.
Or you could do it the easy way and let the inbuilt functionality in Oracle Data Miner do (more...)
In this mini-series of blog posts I’m taking a look at a few very useful tools that can make your life as the sysadmin of a cluster of Linux machines easier. This may be a Hadoop cluster, or just a plain simple set of ‘normal’ machines on which you want to run the same commands and monitoring.
First we looked at using SSH keys for intra-machine authorisation, which is a pre-requisite for executing the same command (more...)
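The idea of running the same command across every machine in the cluster, once key-based SSH is in place, can be sketched roughly as follows. This is an illustrative sketch, not the tooling from the series; the hostnames are made-up examples.

```python
# Sketch: run the same command on every machine in a small cluster over
# SSH, relying on key-based (passwordless) authentication already being
# set up. Hostnames here are hypothetical, not from the original post.
import subprocess

HOSTS = ["node1.cluster.local", "node2.cluster.local", "node3.cluster.local"]

def ssh_command(host, command):
    """Build the ssh invocation for one host.

    BatchMode=yes makes ssh fail fast instead of prompting for a
    password, so this only works once key-based auth is in place."""
    return ["ssh", "-o", "BatchMode=yes", host, command]

def run_everywhere(command, hosts=HOSTS):
    """Run `command` on each host and collect (host, exit_code, stdout)."""
    results = []
    for host in hosts:
        proc = subprocess.run(ssh_command(host, command),
                              capture_output=True, text=True)
        results.append((host, proc.returncode, proc.stdout.strip()))
    return results

# Safe demonstration: just show the command that would be run.
print(ssh_command("node1.cluster.local", "uptime"))
```

Without the `BatchMode=yes` option, a host missing the key would block the whole loop waiting for a password prompt, which is exactly why the SSH key setup is a prerequisite.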
These days Hortonworks with their IPO and Cloudera sitting on $1bn of cash grab all the headlines. However, the real visionary in the field is someone else. Someone blasting the previous world record in TeraSort. A Hadoop distribution on both Amazon Web Services and the Google Compute Engine. A company that Google is invested in. While their competitors have been in skirmishes with each other, MapR has been quietly working away and innovating.
MapR-FS: Features and (more...)
A few days ago the first Early Adopter release of SQL Developer 4.1 (EA1) was made available. You can go ahead and download it from here, and make sure to check out the blog post by Jeff Smith on some install and setup steps required around the latest version of Java.
I've been using SQL Developer since its very first release, so getting my hands on a new release is very exciting. There (more...)
We had a couple of issues reported with the output of the Next Generation Outline Extractor where the exported file did not work properly as a load file. After some investigation, we found that the file encoding was incorrect. We were encoding the files using the Unicode/UTF-8 format. We chose this encoding so that we could write all characters in all languages, but we did not consider that UTF-8 is only valid for loading Unicode (more...)
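To illustrate why the choice of encoding matters for a load file, here is a small sketch in Python. The member name, filenames and encodings are illustrative assumptions, not details taken from the Outline Extractor itself.

```python
# Sketch: the same text produces different bytes under different
# encodings, and a tool expecting a locale encoding (e.g. cp1252)
# will misread UTF-8 bytes. Example text is made up for illustration.
text = "Région"  # a member name containing a non-ASCII character

utf8_bytes = text.encode("utf-8")     # 'é' becomes two bytes: 0xC3 0xA9
cp1252_bytes = text.encode("cp1252")  # 'é' is a single byte: 0xE9

print(utf8_bytes)    # b'R\xc3\xa9gion'
print(cp1252_bytes)  # b'R\xe9gion'

# Decoding UTF-8 bytes as cp1252 "succeeds" but silently corrupts the
# text, which is how an apparently valid export can fail as a load file.
print(utf8_bytes.decode("cp1252"))  # RÃ©gion
```

The corruption is silent because every UTF-8 byte also happens to be a legal cp1252 byte, so nothing errors out; the data is simply wrong by the time it is loaded.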
In this series of blog posts I’m taking a look at a few very useful tools that can make your life as the sysadmin of a cluster of Linux machines easier. This may be a Hadoop cluster, or just a plain simple set of ‘normal’ machines on which you want to run the same commands and monitoring.
Previously we looked at using SSH keys for intra-machine authorisation, which is a pre-requisite for what we’ll look (more...)
In this short series of blog posts I’m going to take a look at a few very useful tools that can make your life as the sysadmin of a cluster of Linux machines easier. This may be a Hadoop cluster, or just a plain simple set of ‘normal’ machines on which you want to run the same commands and monitoring.
To start with, we’re going to use the ever-awesome ssh keys to manage security on the (more...)
In the first two posts in this three-part series on going beyond MapReduce for Hadoop ETL, we looked at why MapReduce and Hadoop 1.0 were only really suitable for batch processing, and how the new Apache Tez framework, enabled by Apache YARN on the Hadoop 2.0 platform, can be swapped in for MapReduce to improve the performance of existing Pig and Hive scripts. Today though, in the final post, I want to take (more...)
Boy, that escalated quickly. I mean, my path to becoming an Oracle ACE since joining Rittman Mead!
When I began with Rittman Mead back in March 2012, I wasn’t planning on joining the ACE program. In fact, I really wasn’t sure what an Oracle ACE even was. If you’re not familiar, the Oracle ACE program “highlights excellence within the global Oracle community by recognizing individuals who have demonstrated both technical proficiency and strong credentials as (more...)
Last week I posted a blog saying that I had passed my TOGAF 9 and ArchiMate 2 certification exams. Below are the two presentations I made for myself to help me study: TOGAF 9 and ArchiMate 2. It’s certainly not enough information to pass both exams, but it might help to get the bigger picture. I hope these (more...)
In the first post in this three-part series on going beyond MapReduce for Hadoop ETL, I looked at how a typical Apache Pig script gets compiled into a series of MapReduce jobs, and how those MapReduce jobs pass data between themselves by writing intermediate resultsets to disk (HDFS, the Hadoop cluster file system). As a reminder, here’s the Pig script we’re working with:
raw_logs = LOAD '/user/mrittman/rm_logs' USING TextLoader AS (line:chararray);
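The LOAD step above just reads each log line as a single string; the next steps of such a dataflow typically split each line into fields. As a rough illustration of that field extraction (in plain Python rather than Pig, and with a made-up sample line rather than real `rm_logs` data, since the full script is in the original series):

```python
# Sketch: the kind of field extraction that typically follows a
# TextLoader LOAD step, shown in plain Python for illustration.
# The regex assumes the Apache access-log format; the sample line
# is invented, not taken from the post's rm_logs data.
import re

LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_log_line(line):
    """Return a dict of fields from one access-log line, or None."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

sample = ('192.168.0.1 - - [01/Dec/2014:10:15:32 +0000] '
          '"GET /index.html HTTP/1.1" 200 2326')
print(parse_log_line(sample)["status"])  # 200
```

In the Pig version this same step would be a FOREACH ... GENERATE with a regex-based extraction, and it is exactly these chained transformations that become separate MapReduce jobs, each writing its output to HDFS.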
The UKOUG annual conferences commence on Sunday 7th December and run until Wednesday 10th.
As in previous years there are two conferences, one called TECH15 and the other called APPS15. You might guess what each conference is about!
This year these conferences are being held at the same time and in the same venue, but they are separate conferences!
This year I've been very lucky (or very unlucky) to have 3 presentations at these conferences. (more...)
The world is changing. People are asking for new kinds of rapidly changing data. This has a big impact on Business Intelligence and Information Management as we have known them for a long time. Therefore I look with great interest at Information Management Architectures in general and Oracle BI Solution Architectures in particular. (more...)
How has the interest in Big Data, Hadoop, Business Intelligence, Analytics and Dashboards changed over the years?
One easy way to gauge the interest is to measure how much news is generated for the related terms, and Google Trends allows you to do that very easily.
Plugging all of the above terms into Google Trends and analysing the results leads to the following visualizations.
Aggregating the results by year
It is fascinating to see (more...)
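The year-level aggregation mentioned above can be sketched as follows. The monthly scores here are made-up placeholders, not actual Google Trends data for these terms.

```python
# Sketch: aggregating monthly Google Trends interest scores by year.
# The (year-month, score) pairs below are invented placeholder data.
from collections import defaultdict

monthly = [
    ("2013-11", 40), ("2013-12", 45),
    ("2014-01", 50), ("2014-02", 55),
]

def aggregate_by_year(rows):
    """Average the monthly scores within each year."""
    totals, counts = defaultdict(int), defaultdict(int)
    for month, score in rows:
        year = month.split("-")[0]
        totals[year] += score
        counts[year] += 1
    return {year: totals[year] / counts[year] for year in totals}

print(aggregate_by_year(monthly))  # {'2013': 42.5, '2014': 52.5}
```

Averaging (rather than summing) keeps partial years comparable with complete ones, since Google Trends scores are already normalised to a 0-100 scale.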
Quarterly breakdown of units sold by manufacturer
View the interactive visualizations