[ale] Organizing data

Chris Fowler cfowler at outpostsentinel.com
Wed Apr 28 00:34:51 EDT 2004


The problem is that the first row is followed by rows that are identical
except for the user id and email.  That is where the extra data comes
from.  The unique key that I will reference is the alarm_id, and there
are 5 rows with the same alarm_id.  In the alarm table there is only one
row per alarm_id, but when you start grabbing data from all over the
database through its relationships you end up with more than one row for
a single alarm.  In the example at http://www.linxdev.com/out.html I need
to condense the first 5 rows of alarm 4 into a single data structure,
then do the same for alarms 5, 6, 7, etc.
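
Roughly, what I'm after is something like this sketch (untested, and the
column order is a guess: it assumes user_id and email are the last two
columns of the SELECT and everything else is identical across the
duplicate rows):

	my %alarms;
	while ( my @row = $sth->fetchrow_array ) {
		my $alarm_id = $row[0];          # the unique key I reference
		my $email    = pop @row;         # only these two columns
		my $user_id  = pop @row;         # differ between the rows
		$alarms{$alarm_id}{common} ||= [ @row ];   # shared data kept once
		push @{ $alarms{$alarm_id}{users} },
		     { user_id => $user_id, email => $email };
	}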

The tree looks like this:

alarm -> location_id -> notification_group_id -> 
                                   escalation_id -> user_id
                                                    user_id
                                                    ... 
                                   escalation_id -> user_id
                                                    user_id
                                                    ...

alarm -> location_id -> notification_group_id -> 
                                   escalation_id -> user_id
                                                    user_id
                                                    ... 
                                   escalation_id -> user_id
                                                    user_id
                                                    ...

Then of course it repeats.  
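
In Perl that tree could be built straight from the flat rows with nested
hashes, something like this (a sketch; it assumes each row carries all of
the ids named above plus the user fields, and uses fetchrow_hashref just
for clarity):

	my %tree;
	while ( my $r = $sth->fetchrow_hashref ) {
		push @{ $tree{ $r->{alarm_id} }
		             { $r->{location_id} }
		             { $r->{notification_group_id} }
		             { $r->{escalation_id} } },
		     { user_id => $r->{user_id}, email => $r->{email} };
	}

That gives one entry per alarm, and walking it back out follows the tree
above level by level.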

In the database there are many mapping tables.  There is one mapping
table that maps a user into an escalation_group.  Then there is another
mapping table that maps an escalation_group into a larger group called a
notification_group.  With the mapping tables there is no way of knowing
ahead of time how many users are in an escalation_group or how many
escalation_groups are in a notification_group.  Location_id and
notification_group_id have a 1-to-1 relationship.  There is no mapping
table for them (yet).
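
For illustration only, loading those two mapping tables might look
roughly like this; the table and column names here are made up, not the
real schema:

	my %esc_users;   # escalation_group_id => [ user_ids ]
	my %ng_escs;     # notification_group_id => [ escalation_group_ids ]

	my $map = $dbh->prepare(
		"SELECT escalation_group_id, user_id FROM user_escalation_map");
	$map->execute;
	while ( my ($eg, $uid) = $map->fetchrow_array ) {
		push @{ $esc_users{$eg} }, $uid;
	}

	$map = $dbh->prepare(
		"SELECT notification_group_id, escalation_group_id
		   FROM escalation_notification_map");
	$map->execute;
	while ( my ($ng, $eg) = $map->fetchrow_array ) {
		push @{ $ng_escs{$ng} }, $eg;
	}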



On Tue, 2004-04-27 at 22:38, Jason Etheridge wrote:
> Chris Fowler wrote:
> > Would this be the best way to group that data?  I'm trying to do it in
> > as few cycles as possible because I have no idea how big that report can
> > get.
> 
> So you can't just pull the data out of the database whenever you want?
> 
> And when you do, you want to move this data into perl as quickly as 
> possible and worry about access speed later?
> 
> If using DBI, I'd be tempted to just dump the results into a file with 
> something like:
> 
> 	$rows = $sth->dump_results($maxlen, $lsep, $fsep, $fh);
> 
> But I don't know how efficient that is, and the man page doesn't 
> recommend it for data transfer applications.  Hrmph.
> 
> fetchrow_arrayref is supposed to be the fastest access method, but you 
> can't keep the reference so I don't see what it buys over 
> fetchrow_array.  If you want to store everything as delimited strings:
> 
> 	while ( @results = $sth->fetchrow_array ) {
> 		$userid = pop @results;
> 		$email = pop @results;
> 		push @{ $user_data{$userid} }, join("|", @results);
> 		if (! defined $user_email{$userid}) {
> 			$user_email{$userid} = $email;
> 		}
> 	}
> 
> Probably faster just to keep pushing arrays rather than concatenate into 
> strings.
> 
> 		push @{ $users{$userid} }, [ @results ];
> 
> But then retrieval is a little more complicated.
> 
> Is this the type of stuff you're looking for?  I don't know perl's 
> object oriented stuff yet.
> 
> -- Jason


