
Kscope13 Recap

As with previous years, this year’s KScope conference did not disappoint!  The location, people, and content were all great.

While the weather in New Orleans was hot, the proximity to many local attractions was cool and made it easy to explore the town between sessions.  The Sheraton was but steps from local dining, shopping, the historic French Quarter, and the Mississippi.  If you wanted to travel farther out, you simply had to walk out the front door and hop on a trolley car.

The record-breaking droves of people from around the world who participated in the conference were great!  The organizers, venue staff, ODTUG Board Members, presenters, vendors, and attendees were all welcoming, helpful, and focused on making the conference successful.  The quality of the people who attend makes this conference special.

Overall, the content was exceptional.  Attendees had their pick of high quality content in all of the regular areas and even a few new areas such as .NET programming!  Oracle was on hand to discuss the latest and greatest releases, consultants shared some of their secrets and best practices, vendors demonstrated their products, and employees provided real-world case studies.

A few of my personal favorite sessions were:

* – Biased as this was MY session!

Speaking of sessions, I once again fielded a session focusing on API programming in EPM.  Unlike the 2012 session which took a shotgun blast approach to all sorts of EPM APIs, this year’s session shined a laser on the HFM COM API.  The goal of the session was to walk the attendees through an entire application build from scratch.

We covered installing the necessary HFM files to your computer, installing a development environment (Visual Studio), creating your first project, adding the necessary DLL file references, implementing key API calls, and compiling/executing your first program.  To further help the attendees get started, I provided a complete sample project that users could use as a starting point.

In case you missed my session(s) or you want a copy of the presentation deck, you can find them here:

2013 Session :

2012 Session :

NOTE:  Links to source code are included in the files at the end of the slide decks.

Hope to see you at Kscope14 in Seattle!


Mardi Gras World
NOLA Trolley Car
What would happen if …..


HFM Subcube Architecture Documents (Pre-

Recently I saw a colleague asking for some information about the HFM table layout and how they could interface with the data in the system.  While I recalled there were a few documents on the subject, I could not locate them via Google.  While I’ve provided them at the bottom of this blog post, I thought I’d give a 10-second, high-level explanation of the primary tables as they relate to the actual HFM data.

Generally speaking, when you configured HFM, you provided a database for use by the system.  This database holds the data for all of the applications created in the environment.  To keep the data separated, each application’s data tables are prefixed with the application name.  (e.g. SampleApp_CSE_11_2011)

Oracle is kind enough to publish a data model breakdown for most of the tables which can be found here:

Interestingly, they do not cover all of the tables, as they appear to skip over the data tables.  When it comes to the data, there are three primary sets of tables that hold the financial information: DCE, DCN, and DCT.  The DCE tables store <Entity Currency> and <Parent Currency> values and the related adjustments.  The DCN tables store data for the remaining members of the Value dimension.  The DCT tables store the journal transactions – when the journals are posted, the data values flow to the DCE and/or DCN tables.

When viewing your database, you will see a series of tables that follow a relatively obvious naming convention:

Elevator Trivia Note: The fact that the Period and Year are part of the naming convention is a key reason why you specify an ending year (and number of periods) during the application build process!

Looking at the structure of these tables, you’ll see the following fields used consistently:


With the exception of lValue, these are pretty straightforward and act as foreign keys back to the respective tables such as <AppName>_Entity_Item.

lValue – Is slightly different, as it takes a little work to determine what it represents.  For each currency in your system, you will have the base currency (“Entity Currency”), Adjustments, and Total.  This field is a combination of that AND the currency.  The following formulas will help you translate:

<AppName>_CURRENCIES.ItemID * 3 + 15 = "Currency Total"
<AppName>_CURRENCIES.ItemID * 3 + 16 = "Currency Adj"
<AppName>_CURRENCIES.ItemID * 3 + 17 = "Currency"
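The translation also works in reverse: given an lValue from a data table, a little arithmetic recovers the currency ItemID and member type.  A quick sketch (the helper name is mine, but the math follows the formulas above):

```python
def decode_lvalue(lvalue):
    """Translate an HFM lValue back into a currency ItemID and value type.
    Inverts lValue = ItemID * 3 + 15/16/17 from the formulas above."""
    offset = lvalue - 15
    item_id = offset // 3          # which row in <AppName>_CURRENCIES
    kind = ("Currency Total", "Currency Adj", "Currency")[offset % 3]
    return item_id, kind

# A currency with ItemID 2 produces lValues 21, 22, and 23:
print(decode_lvalue(21))  # (2, 'Currency Total')
print(decode_lvalue(23))  # (2, 'Currency')
```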

In addition to those fields, the tables all contain one or more Timestamp fields.  A minor conversion is needed on the Timestamp field to make it human readable.  For instance, in SQL Server you would return it as:

cast(TimeStamp2 as smalldatetime)
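If you are pulling the raw value outside of SQL Server, the same conversion can be done by hand – assuming (as the smalldatetime cast implies) that the value is a day count from the SQL Server epoch of 1900-01-01:

```python
from datetime import datetime, timedelta

def hfm_timestamp_to_datetime(ts):
    """Convert a raw HFM Timestamp column value to a datetime.
    Assumption: the value is days since the SQL Server epoch (1900-01-01),
    which is what cast(... as smalldatetime) implies."""
    return datetime(1900, 1, 1) + timedelta(days=ts)

print(hfm_timestamp_to_datetime(41000.5))
```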

After these columns, each of the three table sets has a slightly different layout; however, one additional pattern you will see is a repeating set of columns that correlates to the number of periods in your application.  One column holds your actual data while the other holds a timestamp or some sort of indicator about the data (e.g. Trans Type).
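To make that pattern concrete, here is a small sketch that unpivots one wide row into one record per period.  The dData/dTimestamp column prefixes are hypothetical placeholders; substitute the actual column names from your DCE/DCN tables:

```python
def unpivot_periods(row, data_prefix="dData", stamp_prefix="dTimestamp"):
    """Turn one wide data-table row (dict of column name -> value) into a
    list of (period, value, timestamp) tuples.  Stops at the first missing
    period column, mirroring the incrementing-column layout described above."""
    records = []
    period = 1
    while f"{data_prefix}{period}" in row:
        records.append((period,
                        row[f"{data_prefix}{period}"],
                        row.get(f"{stamp_prefix}{period}")))
        period += 1
    return records

# Toy row standing in for a record fetched from a DCE-style table
row = {"lEntity": 10, "dData1": 100.0, "dTimestamp1": 41000.0,
       "dData2": 250.0, "dTimestamp2": 41001.0}
print(unpivot_periods(row))  # [(1, 100.0, 41000.0), (2, 250.0, 41001.0)]
```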

Armed with this knowledge, one thing you can do is search all of your data tables to find orphaned information.  While Oracle gives you a way to clear it out, it doesn’t give you a good way to see what you are clearing.  With a basic query, we can find out not only whether we have orphaned records, but also what the data is and when it was created/last updated!

The following Microsoft SQL query will dynamically go through ALL tables that reference lEntity and report any rows whose entity is not defined in the entity table.
NOTE: You should theoretically search for Account, ICP, and Custom orphans as well, but this is a start!

-- Get list of tables that refer to the Entity ID and then search them all looking for undefined values
-- NOTE: You would probably do the same for lValue, lAccount, lICP, lCustom1 - lCustom4
-- NOTE: lParent would correspond to the entity table like lEntity
-- NOTE: lSrc and lDes should also be checked against the actual table, i.e. lSrcAccount checked against Accounts

--Initialize some working memory
Declare @tblSearchTable varchar(50)
Declare @tblSrcTable varchar(50)
Declare @tblSearchField varchar(50)
Declare @sql nvarchar(4000)

--Setup defaults
set @tblSearchField = 'lEntity'
set @tblSrcTable = '<APPNAME>_entity_item'

--Setup cursor
Declare tmp_cursor CURSOR
For Select distinct name from sys.tables
    where object_id in (select object_id from sys.columns where name = @tblSearchField)
    order by name asc

Open tmp_cursor

--Retrieve first table name to process
Fetch next from tmp_cursor INTO @tblSearchTable

While @@FETCH_STATUS = 0
Begin
    SET @sql = 'select ''' + @tblSearchTable + ''' as SrcTable, * from [' + @tblSearchTable + '] where ' + @tblSearchField + ' not in (select ItemID from [' + @tblSrcTable + '])'
    EXEC sp_executesql @sql

    --Retrieve next table name to process
    Fetch next from tmp_cursor INTO @tblSearchTable
End

CLOSE tmp_cursor
DEALLOCATE tmp_cursor


Additional Reading

Hyperion Subcube Architecture Document

HFM Tuning Guide

Kscope13 is approaching!

With Kscope13 around the corner, I thought I’d share last year’s presentations:

For Kscope13, I’m presenting a follow-up to my API session, in which we will build a .NET desktop app from scratch and then move on to consuming web services via a tablet app.  It should be interesting for those looking to expand their product’s functionality via the APIs.

To see the full breakdown for Kscope13, please go here :

See you there!

Enumerating Financial Reports via Workspace database

Disclaimer : This post will talk about direct database access. Do not attempt this unless you are sure you know what you are doing, and I’m not liable! I’m just passing along information I’ve learned in case it helps you out….
Recently, we decided that we wanted to review all of our reports to determine which ones we could drop and which ones we should keep. Over the past 7 years, things have accumulated and we literally have thousands of reports (gasp!) between all of the backups, works in progress, and flat-out duplication. To handle the review, we wanted to assign people to review specific sections of the reports. I was asked to provide a nice Excel ‘report’ that would break down the report / folder hierarchy so that we could assign it.
That is when I realized that I really couldn’t locate a tool / report that would generate such a report listing!
In the old 7.2.5 version of Reports, it was relatively easy to get this information as all of the report information (including actual report content) was stored in a single database table.
With System 9 and Workspace, this is no longer true. With Workspace, the files are stored on the server’s file system in a manner consistent with the hierarchy of your reports/folders. Initially I thought I would be able to use this file system to build my report using simple DOS commands. (i.e. dir /s > reportlist.txt) Unfortunately, though, the file names of the reports and folders are not ‘human readable’. (In order to decipher them, you’d have to go into the Workspace database and cross-reference these names to a table, etc.)  Fortunately, however, you can build a report listing via SQL. The script below recursively builds a report with the following output :

(NOTE: Actually the script includes more columns of output; however, for display purposes here, I’m only showing the name and description)

Report Name     Description
AnnualReports NULL
===Actg-YContractual Obligations – Multiple Entities Contractual Obligations
===Annual Validation Summary Annual Validation Summary
===Customers and End User Markets NULL
====== 001 – End Use Markets Book End Use Markets Book
====== 002 – Top Customers Book Top Customers Book
====== 010 – Customers Book Reports – Corp NULL
========= Revenue by Top Customers – All Sectors – Pr Yr Comp Customer Revenues
========= Revenue by Top Customers – All Sectors – Pr Yr Comp -TEST Customer Revenues
====== 020 – Markets Book Reports – Corp NULL
========= Revenue by End User Markets – All Sectors End Use Markets
====== 030 – All Sectors – End Use Markets – Pr Yr Comp NULL
========= End Use Markets – Pr Yr Comp Customer Revenues


SQL 2005+ Query
USE BIPLus_prod;

WITH ReportList (PARENT_FOLDER_UUID, CONTAINER_UUID, NAME, DESCRIPTION, Level, Sort, CREATION_DATE, LAST_MODIFIED_DATE)
AS
(
-- Anchor member definition
SELECT d.PARENT_FOLDER_UUID, d.CONTAINER_UUID, d.NAME, d.DESCRIPTION, 0 AS Level, cast(d.NAME as nvarchar(1024)) AS Sort, d.CREATION_DATE, d.LAST_MODIFIED_DATE
FROM V8_CONTAINER d
WHERE d.PARENT_FOLDER_UUID = '0000011b42b29e00-0000-0b4d-c0a8e22a'
UNION ALL
-- Recursive member definition
SELECT d1.PARENT_FOLDER_UUID, d1.CONTAINER_UUID, d1.NAME, d1.DESCRIPTION, Level + 1, cast(Sort + '|' + d1.Name as nvarchar(1024)), d1.CREATION_DATE, d1.LAST_MODIFIED_DATE
FROM V8_CONTAINER d1
INNER JOIN ReportList r ON d1.PARENT_FOLDER_UUID = r.CONTAINER_UUID
)
-- Statement that executes the CTE
SELECT replicate('===', Level) + NAME AS [Report Name], DESCRIPTION, CREATION_DATE, LAST_MODIFIED_DATE
FROM ReportList
ORDER BY Sort;
How this works :
The table V8_CONTAINER holds the main part of the report / folder entry data. In this table you will find Names, Creation Dates, Modified Dates, Owner name, UUID, and Parent UUID information. In order to build a hierarchy, you will need to pay careful attention to the UUID fields. CONTAINER_UUID is the internal name of the object in Workspace. (i.e. non-user friendly name) PARENT_FOLDER_UUID is a reference back to the item’s parent.
With this in mind, if you know the UUID of your starting point, you can simply recurse from there to get a listing of all folders and subsequent reports. In the script above I start off looking for an entry that is the starting point of our reports that we wanted to audit. For your purposes you will need to change this to what your starting point is.
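The same recursion can be sketched outside the database.  This toy version (all names here are hypothetical; the rows would come from selecting CONTAINER_UUID, PARENT_FOLDER_UUID, and NAME out of V8_CONTAINER) shows the walk from a starting UUID down through the children:

```python
def tree_lines(rows, parent_uuid, depth=0):
    """Walk (CONTAINER_UUID, PARENT_FOLDER_UUID, NAME) rows and emit one
    line per item, indented with '===' per level like the listing above."""
    lines = []
    for uuid, parent, name in sorted(rows, key=lambda r: r[2]):
        if parent == parent_uuid:
            lines.append("===" * depth + name)
            lines.extend(tree_lines(rows, uuid, depth + 1))
    return lines

# Toy rows standing in for a V8_CONTAINER query result
rows = [("a", "root", "AnnualReports"),
        ("b", "a", "Customers and End User Markets"),
        ("c", "b", "Top Customers Book")]
print("\n".join(tree_lines(rows, "root")))
```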

NOTE : If you want to get everything in the table, use REPORTMART. REPORTMART is the parent to everything in the table.
I hope my walkthrough wasn’t terribly boring and was helpful. I apologize for only providing a script that works in SQL Server; however, that is what we use at our site, and there should be enough explanation here for someone relatively proficient with Oracle to write a similar script if they desire.

Synchronizing FDM (9.3.1) Applications w/ T-SQL

At my current location we have one main and two supporting FDM (9.3.1) applications. The main app is used to load production G/L data, while the supporting apps are used for loading our Budget and Forecast data; all are kept separate to ensure that the impact of a mistake in a multi-load template is minimized.
The problem we have with this setup is that we have to do 3x the maintenance work when it comes to users and locations.

While the users and locations are exactly the same in each application, there are some differences, however: a.) Data Maps in Budget / Forecast are simply * to * as we use multi-load templates that have the actual HFM account/entity names in them, whereas production maps account for each company’s GL b.) Categories are different as we limit each FDM app to only the exact Categories (Scenarios) required. c.) Periods are different as we limit each FDM app to only the needed periods.

The solution for me was to implement a direct SQL copy routine for the information I need to move. This helps as I can: a.) automate the task b.) copy only what I want c.) perform the task relatively quickly.

While the Workbench offers the ability to import / export components, I ran into trouble with it not correctly importing all of the data, and it also restricted my flexibility. Copying the entire database was not viable either, as there were pieces I did not want to update….
The script below illustrates how to perform a copy via SQL. **Please note that if you want to use this (or base a script off of this), there’s absolutely no warranty and its use is at your own risk.** I’m fairly comfortable that this works fine (as I’ve tested it in our Dev), but you should test it on your own as well.

Also note, that this script does not copy ALL tables as I didn’t need them all for my purpose.

The script performs the following :

– Clears data from the Target Database:
  – Data Archives
  – Data Maps
  – Data
  – Logs
  – Import Groups/Items
  – Validation Groups/Entities
  – Users / Partition Security
  – Partition Hierarchies / Links
  – Partitions (locations)
– Copies data from the Source Database:
  – Import Groups / Items
  – Validation Groups / Items
  – Partitions (locations)
  – Partition Hierarchy / Links
  – Users / Partition Security
– Updates the Parent Location setting on all Data Load locations. (This is because I want all of my data load locations in the Budget / Forecast apps to use a * to * map. I added a new location in the Prod database which is defined with the * to * mappings, and when I copy everything over, I want the locations to use this map instead of their production G/L map.)


-- FDM Database 'Auto Copy'
-- Author : Charles Beyer
-- Date : 7/1/2010
-- Description : This script is used to automatically / manually sync up database between a SOURCE FDM application and a 
--                DESTINATION FDM application.
-- ##TARGETDB## - Replace this with the name of the Database that you want to SYNC 
-- ##SOURCEDB## - Replace this with the name of the Database that is the data source

-- disable referential integrity
EXEC sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'

--Clear User Security
TRUNCATE TABLE ##TARGETDB##.dbo.tSecUserPartition
TRUNCATE TABLE ##TARGETDB##.dbo.tStructPartitionLinks
delete from ##TARGETDB##.dbo.tStructPartitionHierarchy

--Attempt to clear out the tDataMapSeg tables
EXEC sp_MSForEachTable '
  DECLARE @TableName VarChar(100)
  Set @TableName = PARSENAME(''?'',1)
  IF  left(@TableName,8) = ''tDataMap''
    TRUNCATE TABLE ?
'
--Attempt to clear out the tDataSeg tables
EXEC sp_MSForEachTable '
  DECLARE @TableName VarChar(100)
  Set @TableName = PARSENAME(''?'',1)
  IF  left(@TableName,8) = ''tDataSeg''
    TRUNCATE TABLE ?
'

delete from ##TARGETDB##.dbo.tSecUser

delete from ##TARGETDB##.dbo.tPOVPartition

delete from ##TARGETDB##.dbo.tBhvValEntGroup

delete from ##TARGETDB##.dbo.tBhvImpGroup

-- RECOPY Data from Prod DB to Budget DB
insert into ##TARGETDB##.dbo.tBhvImpGroup
   select * from ##SOURCEDB##.dbo.tBhvImpGroup

insert into ##TARGETDB##.dbo.tBhvValEntGroup
   select * from ##SOURCEDB##.dbo.tBhvValEntGroup

insert into ##TARGETDB##.dbo.tBhvValEntItem
   select * from ##SOURCEDB##.dbo.tBhvValEntItem

insert into ##TARGETDB##.dbo.tBhvImpItemFile
   select * from ##SOURCEDB##.dbo.tBhvImpItemFile

insert into ##TARGETDB##.dbo.tPOVPartition
   select * from ##SOURCEDB##.dbo.tPOVPartition

insert into ##TARGETDB##.dbo.tStructPartitionHierarchy
   select * from ##SOURCEDB##.dbo.tStructPartitionHierarchy

insert into ##TARGETDB##.dbo.tStructPartitionLinks
   select * from ##SOURCEDB##.dbo.tStructPartitionLinks

insert into ##TARGETDB##.dbo.tSecUser
   select * from ##SOURCEDB##.dbo.tSecUser

insert into ##TARGETDB##.dbo.tSecUserPartition
   select * from ##SOURCEDB##.dbo.tSecUserPartition

--Attempt to import data
insert into ##TARGETDB##.dbo.tDataMap (PartitionKey, DimName, SrcKey, SrcDesc, TargKey, WhereClauseType, WhereClauseValue, 
      ChangeSign, Sequence, DataKey, VBScript)
   select PartitionKey, DimName, SrcKey, SrcDesc, TargKey, WhereClauseType, WhereClauseValue, ChangeSign, Sequence, DataKey, 
       VBScript from ##SOURCEDB##.dbo.tDataMap

-- Update parent locations ...
update ##TARGETDB##.dbo.tPOVPartition
  set PartParent = 'BudgetTemplateLoc'
  where PartName <> 'BudgetTemplateLoc' and PartControlsType = 1

-- enable referential integrity again
EXEC sp_MSForEachTable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'