Back in September, I was asked, and agreed, to become the Content Chair for "The Traditional" track at Kscope 13. Like I mentioned there, I had been involved for the past couple of years and it seemed like a natural fit. Plus, I get to play with some really fun people. If you are ready to take advantage of Early Bird Registration, go here (and save $300).
Over the past few weeks we've finalized (mostly) the Sunday Symposium schedule. We're currently working on finalizing Hands-on-Labs (HOL).
Beginning last year, we've had the Oracle product teams running the Sunday Symposia. This (more...)
I decided to save a script that cleans out a couple of tables for me.
Now that I have a script, how do I run it in SQL Dev? In SQL*Plus, I would run it like @clean_tables. Two things to note there: 1, I didn't have to put the extension on the file, and 2, I assumed SQL*Plus was running from the directory where my file was located. If I was running the script from a different directory, I would have to use either a relative path...or something, but I digress.
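Those two behaviors can be sketched in a few lines of Python. This is a hedged illustration of my understanding of how @name resolution works, not SQL*Plus itself, and the file and directory names are made up:

```python
# Sketch of SQL*Plus @name resolution: the .sql extension is implied
# when none is given, and a bare name is resolved against the current
# working directory. Paths below are hypothetical.
import os

def resolve_script(arg, cwd):
    """Mimic @name resolution: append .sql if no extension, join with cwd."""
    name = arg if os.path.splitext(arg)[1] else arg + ".sql"
    return name if os.path.isabs(name) else os.path.join(cwd, name)

# Bare name, run from the script's directory:
print(resolve_script("clean_tables", "/home/me/scripts"))
# /home/me/scripts/clean_tables.sql

# Full path works from anywhere:
print(resolve_script("/home/me/scripts/clean_tables.sql", "/tmp"))
# /home/me/scripts/clean_tables.sql
```

The second call shows why running from a different directory forces you into a relative or absolute path.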
I wanted to be able to run my (more...)
(First off, sorry Mike, I'm hoping this will break my writer's block...)
On Friday I was asked to look at a report that wasn't returning all of the data. Sample:
Year/Month Total Sales Total Sales (YAGO)
01/31/2013 $1,000,000 $900,000
For reference, YAGO is "Year Ago."
Notice anything funny there?
Yeah, February is missing. The (OBIEE) report has a filter on Jan, Feb and Mar of 2013. But it wasn't showing up. I confirmed via manual SQL (hah!) that there was (YAGO) data in there for February. Any ideas?
I immediately suspected one of two (more...)
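The post cuts off before the diagnosis, but the symptom (year-ago data exists for February, yet the February row vanishes) is the classic signature of an inner-join style merge between the current-year and YAGO series. Here's a toy sketch of that behavior; all numbers are invented and this is only one plausible cause, not the author's confirmed answer:

```python
# Simulate merging a current-year sales series with a year-ago (YAGO)
# series. February has YAGO data but no current-year row.
current = {"2013-01": 1_000_000, "2013-03": 1_200_000}   # no February
yago    = {"2013-01": 900_000, "2013-02": 950_000, "2013-03": 1_100_000}

# Inner-join behavior: only months present in BOTH series survive,
# so February silently drops out of the report.
inner = {m: (current[m], yago[m]) for m in current if m in yago}

# Outer-join behavior: keep every month, filling gaps with None.
months = sorted(set(current) | set(yago))
outer = {m: (current.get(m), yago.get(m)) for m in months}

print(sorted(inner))   # ['2013-01', '2013-03'] -- February is gone
print(sorted(outer))   # ['2013-01', '2013-02', '2013-03'] -- kept
```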
I had been working on trying to get a process to run for each file. I used the Get File Names
step followed by the Copy rows to result
step. I had placed this in front of my Text file input
step, which is where you define the file for further processing.
That method produced a stream (that's what it's called in PDI) with each and every file and each and every record in those files. If I were just loading that into a table, it would have worked. However, I was assigning an identifier to each file using a (more...)
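The distinction above can be shown outside PDI with a few lines of plain Python (file names and contents are made up): one combined stream of every record from every file, versus per-file processing where each record carries an identifier for the file it came from.

```python
# Build two tiny sample files, then read them two ways.
import os, tempfile

tmp = tempfile.mkdtemp()
for name, rows in [("a.txt", ["r1", "r2"]), ("b.txt", ["r3"])]:
    with open(os.path.join(tmp, name), "w") as f:
        f.write("\n".join(rows))

files = sorted(os.listdir(tmp))

# One combined stream: every record from every file, no file boundary.
combined = [line for name in files
            for line in open(os.path.join(tmp, name)).read().splitlines()]

# Per-file processing: each record tagged with its file's identifier.
tagged = [(file_id, line)
          for file_id, name in enumerate(files)
          for line in open(os.path.join(tmp, name)).read().splitlines()]

print(combined)  # ['r1', 'r2', 'r3']
print(tagged)    # [(0, 'r1'), (0, 'r2'), (1, 'r3')]
```

The combined list is fine for a straight table load; the tagged list is what you need when each file gets its own identifier.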
aka Kettle, aka PDI.
I've recently taken on a data integration gig and I'll be using Pentaho Data Integration (PDI). I've read about Pentaho for years but never got around to actually using it. I'm excited about the opportunity. What is PDI?
...delivers powerful Extraction, Transformation and Loading (ETL) capabilities using an innovative, metadata-driven approach. With an intuitive, graphical, drag and drop design environment, and a proven, scalable, standards-based architecture, Pentaho Data Integration is increasingly the choice for organizations over traditional, proprietary ETL or data integration tools.
I'll be using the enterprise edition (EE), which is supported, similar to how (more...)