The package connects to a remote FTP server, retrieves a list of files, and downloads each one locally. A C# script then validates each file's data, capturing key values into variables that supply runtime information and populate a table. Once that information is captured, a process loads each file into its own temporary (staging) table. From there, another process merges and transforms the staged data to eliminate duplication, using a fuzzy lookup to catch records that are meant to be the same but don't match exactly (typos, formatting differences). Finally, a slowly changing dimension merges the result into an existing table, which is then used to update some cubes. Complex enough?