Last year I got a message from a customer about a problem with his notification adapter in SOA Suite. His description was: “I’m not receiving all the email that I expect”.
First we thought it had to be a blocked email address on WebLogic, which was not the problem. The second thing that popped into my mind was that we were sending so much email that the SMTP server was blocking us, also that (more...)
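One quick way to take WebLogic out of the equation is to hand the SMTP server a message directly and watch how it responds. A minimal Python sketch; the host name and addresses below are placeholders, not from the customer's setup:

```python
import smtplib
from email.mime.text import MIMEText

def build_notification(sender, recipient, subject, body):
    # Build the same kind of plain-text message the SOA Suite
    # notification adapter would hand to the SMTP server.
    msg = MIMEText(body)
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    return msg

def send_probe(host, msg, port=25):
    # Delivering a probe directly shows whether the SMTP server
    # rejects or throttles us, independent of WebLogic.
    with smtplib.SMTP(host, port, timeout=10) as smtp:
        smtp.send_message(msg)

msg = build_notification("soa@example.com", "ops@example.com",
                         "SOA test", "Probe from notification adapter test")
# send_probe("smtp.example.com", msg)  # placeholder host
```

If the probe is accepted but the adapter's mail still goes missing, the problem sits further down the chain than the SMTP server itself.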
When I was patching all of the agents on our Linux nodes with the latest PSU (22.214.171.124.5), I encountered an error on one Linux node.
PREREQ_NAME: Performer check
PREREQ_DESC: Check if current performer are the file owner of /apps/oracle/product/agent12c/core/126.96.36.199.0.
PREREQ_MESG: Current user tony from Normal Oracle Home Credentials is not the file owner of /apps/oracle/product/agent12c/core/188.8.131.52.0.
Somehow privilege delegation hasn’t been applied (more...)
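The prerequisite that fails here is essentially a comparison between the owner of the Oracle home and the user the patch actually runs as. A minimal Python sketch of the same check; the path is a placeholder:

```python
import os
import pwd

def is_file_owner(path):
    """Return True if the current (effective) user owns the path,
    mirroring the agent patching 'Performer check' prerequisite."""
    owner = pwd.getpwuid(os.stat(path).st_uid).pw_name
    current = pwd.getpwuid(os.geteuid()).pw_name
    return owner == current

# Example: check the agent core home (placeholder path)
# print(is_file_owner("/apps/oracle/product/agent12c"))
```

When privilege delegation is applied correctly, the "current performer" resolves to the software owner and this check passes; when it is skipped, you stay logged in as your personal account and the check fails, exactly as in the output above.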
There continues to be a disproportionate amount of hype around 'NoSQL' data stores. By disproportionate I mean 'completely and utterly out of scale with the actual problems of the vast majority of companies'. I wrote before about 'how NoSQL became more SQL'. The point I made there is now more apparent the more I work with companies on Big Data challenges.
There are three worlds of data
In our environment all machines are linked to an LDAP server for authentication (AD in this case). You need to log on to machines with your personal credentials and use sudo to switch to a system account. This is a common practice and easily implemented in EM Cloud Control through ‘named credentials’. In my case I have several system accounts I need to be able to ‘sudo’ to, to perform actions like patching. So for (more...)
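For reference, the privilege delegation that named credentials rely on comes down to a sudoers rule along these lines; the user and account names are hypothetical, not from my environment:

```text
# /etc/sudoers.d/em-patching (hypothetical names)
# Allow the personal LDAP account to run commands as the oracle
# software owner without a password, so EM named credentials with
# SUDO privilege delegation can perform patching as that account.
jdoe ALL=(oracle) NOPASSWD: ALL
```

Without a rule like this in place on the target, Cloud Control cannot switch to the system account and the delegated credential test will fail.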
This morning (for us on CET) brought a surprise from the developers over at Oracle Application Express headquarters. The Early Adopters release for APEX 5.0 has been made available to the public.
On https://apexea.oracle.com you can request access to the environment. After approval you can start testing all the new features for the upcoming version of APEX.
There are many things to discover, but the most obvious one is the new Grid (more...)
Looks like it’s time to start planning for the IDM conference schedule. There are some great conferences planned and I need to figure out how to start budgeting for some of these. Let me know if I have missed any conferences that should be listed.
I've been pretty vocal about Java going down the wrong path. My view is that Java should have a 'core' containing just the real basics of the VM and the language, plus a few profiles that specify what needs to be loaded, with the rest coming in on demand based on the requirements of a given project. The old 'it needs to have everything so the browser/desktop/etc' is just
Which came first, Big Data or Fast Data? If you go by the hype you'd think Hadoop and Big Data came first, with in-memory and fast coming after. The reality though is the other way around, and it comes from a simple question:
Where do you think all that Big Data came from?
When you look around at the massive Big Data sources out there, Facebook, Twitter, sensor data,
After nearly every one of my friends started using WhatsApp, it was overdue for me to download the app.
Why is this app used by everyone? What's cool about this app?
- Nothing to do after the download except the SMS verification
- No account creation
- No new password (or reusing another one... nooo)
- No need to search for addresses
- No need to maintain your contact list
- Just start messaging
Why does this work? Whatsapp uses (more...)
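WhatsApp's actual verification protocol isn't covered here, but the SMS step every user goes through boils down to issuing and checking a one-time code tied to the phone number. An illustrative Python sketch, not WhatsApp's implementation:

```python
import hmac
import secrets

def issue_code():
    # Generate a 6-digit one-time code, like the one sent by SMS
    # during signup. secrets gives a cryptographically strong value.
    return f"{secrets.randbelow(1_000_000):06d}"

def verify_code(expected, submitted):
    # Constant-time comparison to avoid leaking the code via timing.
    return hmac.compare_digest(expected, submitted)

code = issue_code()  # server sends this via SMS
# user types the code back into the app; verify_code(code, typed)
```

Because the phone number itself is the identity, there is nothing else to register: no username, no password, no address book to build.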
The game Cluedo (or just plain Clue in North America) is about discovering which person committed the murder, in which room, and with what weapon. What is amazing is that in IT we have the easiest game of Cluedo going, and yet over and over again we murder the poor unfortunate business in the same way, then stand back and gasp 'I didn't know that would kill them'.
I talk about the EDW, the IT departments
So I wrote about why your Hadoop project will fail, so I think it's only right that I follow up with some things you can actually do to make the Big Data project you take on succeed. The first thing you need to do is stop trying to make 'Big Data' succeed and instead start focusing on how you educate the business on the value of information, and then work out how to deliver new (more...)
IT is a communist state in many organisations, one that believes in rigid adherence to inflexible approaches despite clear indications that they inhibit growth, and in a central approach to planning that Mao and Stalin would have thought was taking things a little too far. This really doesn't make sense in the capitalistic world of business, and the counter-revolution is well under way. It's
There has been a policy in integration that has stored up a really great challenge for data security, and by great I don't mean 'fantastic' I mean 'aw crap'. It's a policy that was adopted for the best of reasons and one that will, in future, represent a growing challenge to Big Data and federated information.
The policy can be described as this:
Users authenticate with Apps, Apps
Ok so Hadoop is the bomb, Hadoop is the schizzle, Hadoop is here to solve world hunger and all problems. Now I've talked before about some of the challenges around Hadoop for enterprises, but here are six reasons that Information Week is right when it says that Hadoop projects are going to fail more often than not.
1. Hadoop is a Java thing not a BI thing
The first is the most important
Today, while trying different settings with Basic Authentication and SOA Suite, I wanted to switch from the embedded OWSM Policy Repository of JDeveloper to the one stored on the application server. In JDeveloper you can do that through the preferences (Tools | Preferences). See this blog for more details.
Click on the App Server Connection option and choose an existing connection through the Connections drop-down or add a new one by clicking New.