pyfred

Python module to populate Federal Reserve Economic Data (FRED) via the FRED REST API.

Architecture Diagram

(PyFred architecture diagram image)

Directory Structure

  • bin (source code)
  • data (data files downloaded from the FRED REST API)
  • archive (compressed old data files)
  • report (generated reports)
  • doc (documentation)

Manual Installation

  1. Save and uncompress the provided fred.zip file in your local postgres home directory

su - postgres
cd ~/fred/bin

  2. Connect to the database via psql to create the fred schema, staging table, and observation table (a hedged sketch of the presumed schema follows the command)

psql -d postgres -f sql/create_observation_schema.sql
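create_observation_schema.sql itself is not reproduced in this README. Judging from the columns used by the COPY commands in the load step, the staging table it creates presumably looks roughly like the DDL below; this is only a guess at the shape of the schema, sketched here from Python with psycopg2 rather than psql, not the actual script.

# presumed_schema.py -- illustrative guess, not the shipped create_observation_schema.sql
import psycopg2

DDL = """
CREATE SCHEMA IF NOT EXISTS fred;
CREATE TABLE IF NOT EXISTS observation_staging (
    series           text,
    realtime_start   date,
    realtime_end     date,
    observation_date date,
    value            text   -- kept as text; FRED publishes '.' for missing observations
);
-- the final observation table presumably mirrors these columns with typed values
"""

with psycopg2.connect(dbname="postgres", user="postgres") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)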

  3. Execute the Python script to fetch series data from the FRED API and convert it to CSV data files (an illustrative sketch of what this script does follows the commands)

python populate_csv.py -s UNRATE
python populate_csv.py -s UMCSENT
python populate_csv.py -s GDPC1
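The sketch below shows the kind of fetch-and-convert logic populate_csv.py presumably performs, assuming the standard FRED series/observations endpoint and an API key; the endpoint parameters, key handling, and output path here are illustrative assumptions, not the script's actual code. The column order matches the staging table used in the COPY step below.

# sketch_populate_csv.py -- illustrative only, not the shipped populate_csv.py
import csv
import sys
import requests

FRED_URL = "https://api.stlouisfed.org/fred/series/observations"
API_KEY = "YOUR_FRED_API_KEY"  # assumption: the real script reads this from config or env

def fetch_series_to_csv(series_id, out_path):
    # Ask FRED for the series observations as JSON.
    resp = requests.get(FRED_URL, params={
        "series_id": series_id,
        "api_key": API_KEY,
        "file_type": "json",
    })
    resp.raise_for_status()
    observations = resp.json()["observations"]

    # Write one CSV row per observation, matching the staging table columns:
    # (series, realtime_start, realtime_end, observation_date, value)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["series", "realtime_start", "realtime_end",
                         "observation_date", "value"])
        for obs in observations:
            writer.writerow([series_id, obs["realtime_start"],
                             obs["realtime_end"], obs["date"], obs["value"]])

if __name__ == "__main__":
    series_id = sys.argv[sys.argv.index("-s") + 1]  # e.g. -s UNRATE
    fetch_series_to_csv(series_id, "../data/fred_series_%s.csv" % series_id)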
  4. Truncate the staging table (skip this step if staging is empty, i.e. there is no prior load)
psql -d postgres -f sql/truncate_observation_staging.sql
  5. Load the CSV data into the staging table (an optional Python alternative appears after the psql commands)

Note: You will have to change the data file directory path if it is different in your environment.

psql -d postgres -c "COPY observation_staging
              (series, realtime_start, realtime_end, observation_date, value)
              FROM '/Users/postgres/fred/data/fred_series_UNRATE.csv'
              WITH (ENCODING 'utf-8', HEADER 1, FORMAT 'csv')";

psql -d postgres -c "COPY observation_staging
              (series, realtime_start, realtime_end, observation_date, value)
              FROM '/Users/postgres/fred/data/fred_series_UMCSENT.csv'
              WITH (ENCODING 'utf-8', HEADER 1, FORMAT 'csv')";

psql -d postgres -c "COPY observation_staging
              (series, realtime_start, realtime_end, observation_date, value)
              FROM '/Users/postgres/fred/data/fred_series_GDPC1.csv'
              WITH (ENCODING 'utf-8', HEADER 1, FORMAT 'csv')";
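If you prefer to drive the load from Python instead of psql, the same COPY can be issued through psycopg2's copy_expert. This is an optional sketch, not part of the provided scripts; the data path is the same assumption as above and must match your environment.

# load_staging.py -- optional psycopg2 alternative to the psql COPY commands above
import psycopg2

COPY_SQL = """
    COPY observation_staging
         (series, realtime_start, realtime_end, observation_date, value)
    FROM STDIN WITH (FORMAT csv, HEADER true)
"""

def load_csv(path):
    with psycopg2.connect(dbname="postgres", user="postgres") as conn:
        with conn.cursor() as cur, open(path, encoding="utf-8") as f:
            # Stream the file through COPY ... FROM STDIN instead of a server-side path.
            cur.copy_expert(COPY_SQL, f)

for series in ("UNRATE", "UMCSENT", "GDPC1"):
    load_csv("/Users/postgres/fred/data/fred_series_%s.csv" % series)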
  6. Validate and flag the staging table (an illustrative example of this kind of check follows the command)
psql -d postgres -f sql/validate_observation_staging.sql
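The actual validation rules live in sql/validate_observation_staging.sql and are not shown here. As a rough illustration of what such a pass checks: FRED publishes '.' for missing observations, so non-numeric values in staging are the obvious rows to flag. The snippet below only counts them and is not the real validation logic.

# validate_staging.py -- illustrative check only, not sql/validate_observation_staging.sql
import psycopg2

with psycopg2.connect(dbname="postgres", user="postgres") as conn:
    with conn.cursor() as cur:
        # Count staging rows whose value is not a plain number (e.g. FRED's '.' placeholder).
        cur.execute(r"""
            SELECT series, count(*)
              FROM observation_staging
             WHERE value !~ '^-?[0-9]+(\.[0-9]+)?$'
             GROUP BY series
        """)
        for series, bad_rows in cur.fetchall():
            print("%s: %d non-numeric values" % (series, bad_rows))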
  7. Transform the data in staging and load it into the observation table
psql -d postgres -f sql/transform_and_load_observation_table.sql
  8. Generate the average annual unemployment rate report into the report directory (an illustrative version of the query follows the command)
psql -d postgres -f sql/annual_unemployment_average.sql  -o ../report/annual_unemployment_average_report.txt
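The report query itself is in sql/annual_unemployment_average.sql. It presumably averages the UNRATE series by year along the lines of the sketch below, shown here run from Python; the exact SQL, rounding, and output format are assumptions, not the shipped report.

# annual_average.py -- illustrative version of the annual unemployment average report
import psycopg2

QUERY = """
    SELECT extract(year FROM observation_date) AS year,
           round(avg(value::numeric), 2)       AS avg_unemployment_rate
      FROM observation
     WHERE series = 'UNRATE'
     GROUP BY 1
     ORDER BY 1
"""

with psycopg2.connect(dbname="postgres", user="postgres") as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for year, avg_rate in cur.fetchall():
            print(int(year), avg_rate)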

Quick Installation

  1. Save and extract the fred tar file in the postgres user home directory
su - postgres
tar -xvf fred.tar
cd ~/fred/bin
  2. Connect to the database via psql to create the fred schema, staging table, and observation table
psql -d postgres -f sql/create_observation_schema.sql
  3. I wrote a simple automation script (run.py) to do the following tasks:
  • Fetch JSON data from the FRED REST API
  • Convert it into CSV data files
  • Truncate the staging table
  • Load the CSV data into the staging table
  • Transform and load data from staging into the observation table

Note: You will have to verify the load data file path and change it for your environment if it is different. The path is on line 22 of the run.py script:

   FROM '/Users/postgres/fred/data/fred_series_%s.csv'

Then run the script (an illustrative outline of the flow follows):

python run.py
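The shipped run.py is the authoritative version; the outline below is only an illustration of how such a script chains the manual steps above (the helper function, series list, and data path are assumptions).

# sketch_run.py -- illustrative outline of the automation flow, not the shipped run.py
import subprocess

SERIES = ("UNRATE", "UMCSENT", "GDPC1")
DATA_PATH = "/Users/postgres/fred/data/fred_series_%s.csv"  # adjust for your environment

def psql(*args):
    # Run a psql command and stop the pipeline if it fails.
    subprocess.run(["psql", "-d", "postgres", *args], check=True)

# Fetch JSON from the FRED REST API and convert it to CSV data files.
for s in SERIES:
    subprocess.run(["python", "populate_csv.py", "-s", s], check=True)

# Truncate the staging table.
psql("-f", "sql/truncate_observation_staging.sql")

# Load the CSV data files into the staging table.
for s in SERIES:
    psql("-c", "COPY observation_staging "
               "(series, realtime_start, realtime_end, observation_date, value) "
               "FROM '%s' WITH (ENCODING 'utf-8', HEADER 1, FORMAT 'csv')" % (DATA_PATH % s))

# Transform and load data from staging into the observation table.
psql("-f", "sql/transform_and_load_observation_table.sql")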
  4. Generate the average annual unemployment rate report into the report directory
psql -d postgres -f sql/annual_unemployment_average.sql  -o ../report/annual_unemployment_average_report.txt
  5. To schedule the script to run every day at midnight (run.py must be executable with a Python shebang line, or be invoked via the python interpreter in the crontab entry)
crontab -e
0 00 * * * /Users/postgres/fred/bin/run.py
