Here are the slides from yesterday’s OakTableWorld presentation. They also include a few hints about what our hot new venture Gluent is doing (although bigger announcements will come later this year).
Also, if you are at Oracle OpenWorld right now, my other presentation about SQL Monitoring in 12c is tomorrow at 3pm in Moscone South 103. See you there!
NB! After a 1.5 year break, this year’s only Advanced (more...)
Oracle OpenWorld is just around the corner – I will have one presentation at OOW this year and another at the independent OTW event:
Connecting Oracle with Hadoop
Real-Time SQL Monitoring in Oracle Database 12c
- Conference: OpenWorld
- Time: Wednesday, 28 Oct, 3:00pm
- Location: Moscone South 103
- Abstract: Click here
I plan to hang out at the OTW venue on Monday and Tuesday, so see you (more...)
I last wrote about Couchbase in November, 2012, around the time of Couchbase 2.0. One of the many new features I mentioned then was secondary indexing. Ravi Mayuram just checked in to tell me about Couchbase 4.0. One of the important new features he mentioned was what I think he said was Couchbase’s “first version” of secondary indexing. Obviously, I’m confused.
Now that you’re duly warned, let me remind you of aspects of (more...)
Gluent – where I’m a cofounder & CEO – is hiring awesome developers and (big data) infrastructure specialists in the US and UK!
We are still in stealth mode, so we won’t be detailing publicly exactly what we are doing ;-)
However, it is evident that modern data platforms (Hadoop, for example), with their scalability, affordability at scale and freedom to use many different processing engines on open data formats, are turning enterprise IT upside down.
This shift has already been going (more...)
A few months ago I was asked to give a two-hour lecture to a group of CIOs. The topic was a bit vague – “Introduction to Big Data and NoSQL” – but I agreed to give it a try anyway.
Since Big Data is such a big topic, and since I really wanted to give the CIOs some added value, I created this presentation. The aim of the presentation wasn’t to cover all (more...)
Read this article on my new blog
A very common use case when working with Hadoop is to store and query simple text files (CSV, TSV, ...), and then, for better performance and more efficient storage, to convert those files into a more efficient format, for example Apache Parquet.
Apache Parquet is a columnar storage format available to any project in the Hadoop ecosystem. Apache Parquet has the following
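As a sketch of the conversion step described above, one common approach in Hive is a CREATE TABLE AS SELECT into a Parquet-backed table (the table and column names here are hypothetical, and the exact syntax assumes Hive 0.13 or later):

```sql
-- Hypothetical table over raw CSV files already in HDFS
CREATE EXTERNAL TABLE logs_csv (
  event_time STRING,
  user_id    STRING,
  url        STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/logs';

-- Convert to Parquet in one statement: CREATE TABLE AS SELECT
CREATE TABLE logs_parquet
STORED AS PARQUET
AS SELECT * FROM logs_csv;
```

Queries against the Parquet table then read only the columns they need, which is where the performance and storage benefits come from.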
1. There are multiple ways in which analytics is inherently modular. For example:
- Business intelligence tools can reasonably be viewed as application development tools. But the “applications” may be developed one report at a time.
- The point of a predictive modeling exercise may be to develop a single scoring function that is then integrated into a pre-existing operational application.
- Conversely, a recommendation-driven website may be developed a few pages — and hence also a few (more...)
We will be presenting the Sonra Hadoop Quick Start Appliance at CeBIT next week in Hanover. Meet and greet us in Hall 2, Stand D52 (C58).
At Sonra we understand the difficulties businesses face when they begin their Big Data journey. We help you get started in days or weeks so that you can immediately reap the benefits of Big Data. Sonra have packaged optimised Supermicro Hadoop hardware with MapR, the prime Hadoop distribution, and added our (more...)
We got excellent feedback for our first Hadoop User Group Ireland meetup. We wined, dined, and entertained more than 100 Hadoopers (and there was even beer left at the end of the night).
If you want to find out more about Sonra’s Hadoop Data Warehouse Quick Starter Solutions you can contact me or connect with me on LinkedIn.
For those of you who missed the event I have posted some pictures below. We have recorded (more...)
In this blog post we look at how we can address a shortcoming of the Hive ALTER TABLE statement using parameters and variables in the Hive CLI (Hive 0.13 was used).
There’s a simple way to query Hive parameter values directly from the CLI.
You simply execute the SET command without specifying a value to be set:
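A minimal sketch of the technique (the parameter and table names below are illustrative): issuing SET with just a name prints the current value, and a variable defined with the hivevar namespace can be substituted into later statements.

```sql
-- Print the current value of a parameter (no value supplied)
SET hive.exec.dynamic.partition;

-- Define a session variable in the CLI...
SET hivevar:tbl=web_logs;

-- ...and substitute it into a later statement
DESCRIBE ${hivevar:tbl};
```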
You may use those parameters directly in (more...)
Join MapR and Sonra for the Hadoop User Group Ireland Meetup on 23 February at 6 pm at the Wayra offices (O2/Three building). You’ll learn more about the MapR distribution for Apache Hadoop through use cases, case studies and an introduction to the benefits of using the MapR platform.
Come by for this content-packed first event ending with the opportunity to socialise over beer and pizza kindly provided by Sonra.
What is (more...)
If you want to upskill and get certified on Hadoop, you can now do so for free, thanks to MapR. Over the next couple of weeks they are rolling out their on-demand Hadoop training courses. The highlight of the first batch of courses is Developing Hadoop Applications on YARN.
Data isn't really respected in businesses. You can see that because, unlike other corporate assets, there is rarely a decent corporate catalog that shows what data exists and who has it. In the vast majority of companies, more effort and automation goes into tracking laptops than into cataloging and curating information.
Historically we've sort of been able to get away with this
Over six parts I've gone through a bit of a journey through what Big Data Security is all about:
- Securing Big Data is about layers
- Use the power of Big Data to secure Big Data
- How maths and machine learning help
- Why it's how you alert that matters
- Why Information Security is part of Information Governance
- Classifying risk and the importance of meta-data
The fundamental point here is that
So now that your Information Governance groups consider Information Security to be important, you have to think about how they should be classifying risk. There are documents out there on some of these that talk about frameworks. British Columbia's government has one, for instance, that talks about High, Medium and Low risk, but for me that really misses the point and oversimplifies the
What does your security team look like today?
Or the IT equivalent: "the folks that say no". The point is that in most companies information security isn't actually considered important. How do I know this? Because most IT Security teams are the equivalent of nightclub bouncers: they aren't the people who own the club, and they aren't as important as the
In the first three parts of this I talked about how Securing Big Data is about layers, then about how you need to use the power of Big Data to secure Big Data, and then about how maths and machine learning help to identify what is reasonable and what is anomalous.
The Target Credit Card hack highlights this problem. Alerts were made, lights did flash. The problem was that so many lights flashed and
In the first two parts of this I talked about how Securing Big Data is about layers, and then about how you need to use the power of Big Data to secure Big Data. The next part is: what do you do with all that data? This is where machine learning and mathematics come in; in other words, it's about how you use Big Data analytics to secure Big Data.
What you want (more...)
In the first part of Securing Big Data I talked about the two different types of security: the traditional IT and ACL security that needs to be done to match traditional RDBMS-based solutions. But that is pretty much where those systems stop in terms of security, which means they don't address the real threats out there, which are cyber attacks and social engineering. An ACL is only