Diffstat (limited to 'python')
-rw-r--r--  python/Load Intex Data.ipynb  207
1 file changed, 0 insertions(+), 207 deletions(-)
diff --git a/python/Load Intex Data.ipynb b/python/Load Intex Data.ipynb
deleted file mode 100644
index 0aa32a72..00000000
--- a/python/Load Intex Data.ipynb
+++ /dev/null
@@ -1,207 +0,0 @@
-{
- "metadata": {
- "name": "Load Intex Data"
- },
- "nbformat": 3,
- "nbformat_minor": 0,
- "worksheets": [
- {
- "cells": [
- {
- "cell_type": "heading",
- "level": 2,
- "metadata": {},
- "source": [
- "Preamble"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "In everything that follows the paths are relative from\n",
- "our directory ``//WDSENTINEL/share/CorpCDOs`` on Windows and\n",
- "``/home/share/CorpCDOs`` on Linux."
- ]
- },
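- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "In that sketch, the ``os.name`` check and the ``base_dir`` name are\n",
- "illustrative assumptions, not part of the original workflow:"
- ]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "import os\n",
- "\n",
- "# Sketch only: pick the share root by platform. The two paths are the\n",
- "# ones quoted above; the os.name check itself is an assumption.\n",
- "if os.name == 'nt':\n",
- "    base_dir = '//WDSENTINEL/share/CorpCDOs'\n",
- "else:\n",
- "    base_dir = '/home/share/CorpCDOs'\n",
- "print(base_dir)"
- ],
- "language": "python",
- "metadata": {},
- "outputs": []
- },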
- {
- "cell_type": "heading",
- "level": 2,
- "metadata": {},
- "source": [
- "Load data from intex and populate the database"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "we need to fill up three tables:\n",
- "\n",
- "* clo_universe\n",
- "* cusip_universe\n",
- "* et_collateral\n",
- "\n",
- "We first create a directory called ``data/Indicative_%Y-%m-%d``.\n",
- "From the spreadsheet, this is driven by the two shortcuts\n",
- "``CTRL+Shift+Y`` and ``CTRL+Shift+G``.\n",
- "\n",
- "* First select the cusips of interest, do ``CTRL+Shift+Y`` and paste them\n",
- " into a new portfolio tab in intex. On the portfolio tab itself,\n",
- " right click, then Export to text, and save into ``data/Indicative_%Y-%m-%d``.\n",
- " The intex filename should be called ``Portfolio.txt`` or ``Portfolio(i).txt``\n",
- " for some ``i``.\n",
- "* Then in the Collat Detail tab, make sure Asset Detail is\n",
- " toggled. Right click, then Export to text, All bonds and extract the\n",
- " resulting zip file into a folder named ``data/Collaterals_%Y-%m-%d``\n",
- "* Last step, select the cusips again, do ``CTRL+SHIFT+G``, paste into a\n",
- " new portfolio tab. Click on All Tranches, Right Click, Export to\n",
- " text, and save the file into a folder named ``data/Indicative_%Y-%m-%d``.\n",
- " The file name should be called ``Portfolio_TrInfo.txt`` or\n",
- " ``Portfolio_TrInfo(i).txt`` for some ``i``.\n",
- "\n",
- "We're done exporting the data from intex, now we need to load it into\n",
- "the database. We need to run 2 scripts. To do so, after connecting to\n",
- "the remote server, at the prompt do:\n",
- "\n",
- "$$c = \\sqrt{a^2 + b^2}$$\n"
- ]
- },
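- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "The folder names and glob patterns in this check are taken from the\n",
- "text above; the variable names and the check itself are assumptions,\n",
- "not part of the load scripts."
- ]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "import datetime\n",
- "import glob\n",
- "import os\n",
- "\n",
- "# Sketch only: confirm today's Intex exports are in place before\n",
- "# running load_indicative.py and load_intex_collateral.py.\n",
- "today = datetime.date.today().strftime('%Y-%m-%d')\n",
- "indicative_dir = os.path.join('data', 'Indicative_' + today)\n",
- "collateral_dir = os.path.join('data', 'Collaterals_' + today)\n",
- "\n",
- "tranche_exports = glob.glob(os.path.join(indicative_dir, 'Portfolio_TrInfo*.txt'))\n",
- "deal_exports = [f for f in glob.glob(os.path.join(indicative_dir, 'Portfolio*.txt'))\n",
- "                if f not in tranche_exports]\n",
- "print(indicative_dir, ':', len(deal_exports), 'deal files,', len(tranche_exports), 'tranche files')\n",
- "print(collateral_dir, 'exists:', os.path.isdir(collateral_dir))"
- ],
- "language": "python",
- "metadata": {},
- "outputs": []
- },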
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "%run load_indicative.py"
- ],
- "language": "python",
- "metadata": {},
- "outputs": [
- {
- "ename": "InterfaceError",
- "evalue": "connection already closed",
- "output_type": "pyerr",
- "traceback": [
- "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mInterfaceError\u001b[0m Traceback (most recent call last)",
- "\u001b[0;32m/usr/lib/python3/dist-packages/IPython/utils/py3compat.py\u001b[0m in \u001b[0;36mexecfile\u001b[0;34m(fname, glob, loc)\u001b[0m\n\u001b[1;32m 74\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mexecfile\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfname\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mglob\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mloc\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 75\u001b[0m \u001b[0mloc\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mloc\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0;34m(\u001b[0m\u001b[0mloc\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32melse\u001b[0m \u001b[0mglob\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 76\u001b[0;31m \u001b[0mexec\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcompile\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mopen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfname\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m'rb'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mread\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfname\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m'exec'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mglob\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mloc\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 77\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 78\u001b[0m \u001b[0;31m# Refactor print statements in doctests.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
- "\u001b[0;32m/home/share/CorpCDOs/code/python/load_indicative.py\u001b[0m in \u001b[0;36m<module>\u001b[0;34m()\u001b[0m\n\u001b[1;32m 164\u001b[0m \u001b[0;31m#first load deal data\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 165\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mdeal\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mdeal_files\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 166\u001b[0;31m \u001b[0mupload_deal_data\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdeal\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 167\u001b[0m \u001b[0;31m#then load tranche data\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 168\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mcusip\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mcusip_files\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
- "\u001b[0;32m/home/share/CorpCDOs/code/python/db.py\u001b[0m in \u001b[0;36mwith_connection_\u001b[0;34m(*args, **kwargs)\u001b[0m\n\u001b[1;32m 12\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mException\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 13\u001b[0m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0me\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 14\u001b[0;31m \u001b[0mconn\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrollback\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 15\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 16\u001b[0m \u001b[0;32mreturn\u001b[0m \u001b[0mrv\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
- "\u001b[0;31mInterfaceError\u001b[0m: connection already closed"
- ]
- },
- {
- "output_type": "stream",
- "stream": "stdout",
- "text": [
- "connection already closed\n",
- "connection already closed\n"
- ]
- }
- ],
- "prompt_number": 12
- },
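- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "The run above failed because the connection wrapper in ``db.py`` tried\n",
- "to roll back on a connection that was already closed. The cell below is\n",
- "only a sketch of the shape of such a decorator as suggested by the\n",
- "traceback; the real ``code/python/db.py`` may differ, and\n",
- "``get_connection`` is an assumed helper, not an existing function."
- ]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "import functools\n",
- "\n",
- "# Sketch only (inferred from the traceback above, not the actual db.py):\n",
- "# run a function with a DB connection, committing on success and rolling\n",
- "# back on failure. If the connection has already been closed, even\n",
- "# conn.rollback() raises InterfaceError, which is the error printed above.\n",
- "def with_connection(func):\n",
- "    @functools.wraps(func)\n",
- "    def with_connection_(*args, **kwargs):\n",
- "        conn = get_connection()  # assumed helper returning a DB connection\n",
- "        try:\n",
- "            rv = func(conn, *args, **kwargs)\n",
- "            conn.commit()\n",
- "        except Exception as e:\n",
- "            print(e)\n",
- "            conn.rollback()\n",
- "        else:\n",
- "            return rv\n",
- "    return with_connection_"
- ],
- "language": "python",
- "metadata": {},
- "outputs": []
- },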
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "%run load_intex_collateral.py"
- ],
- "language": "python",
- "metadata": {},
- "outputs": [
- {
- "output_type": "stream",
- "stream": "stdout",
- "text": [
- "done\n"
- ]
- }
- ],
- "prompt_number": 11
- },
- {
- "cell_type": "heading",
- "level": 2,
- "metadata": {},
- "source": [
- "Steps to build the model"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "* We need to have a monitor running that will trigger the scenarios\n",
- " generations for us.\n",
- "\n",
- " cd /home/share/CorpCDOs/scripts\n",
- " bash monitor.sh\n",
- "\n",
- " **Note** need to set it up as a daemon so that it get automatically\n",
- " started at boot.\n",
- "* After that, scenarios generation can be triggered by selecting the\n",
- " cusips in the excel spreadsheet and using the ``CTRL+Shift+H``\n",
- " shortcut.\n",
- "* To disable Reinvestment put \"Y\" under \"DisableReinv\" column before\n",
- " using ``CTRL+Shift+H``.\n",
- "* htop to monitor server activities.\n",
- "* This will trigger three scripts: first it will build the survival\n",
- " curves: these are saved into ``Scenarios/Portfolios_%Y_%m_%d``,\n",
- " second it will generate the loss and prepay distributions and save\n",
- " them into ``Scenarios/Intex curves_%Y-%m%d/csv``, and finally\n",
- " generate the sss files that we can load into intex in\n",
- " ``Scenarios/Intex curves_%Y-%m%d/sss``.\n",
- "* You can look at the logs file in ``scripts/logs``\n",
- " to monitor the process."
- ]
- },
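- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "The directory patterns in this check are the ones listed above; the\n",
- "code itself is an assumption and is not one of the three triggered\n",
- "scripts."
- ]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "import datetime\n",
- "import os\n",
- "\n",
- "# Sketch only: count the files that have appeared in today's scenario\n",
- "# output folders (patterns copied from the text above).\n",
- "today = datetime.date.today()\n",
- "outputs = [\n",
- "    os.path.join('Scenarios', today.strftime('Portfolios_%Y_%m_%d')),\n",
- "    os.path.join('Scenarios', today.strftime('Intex curves_%Y-%m%d'), 'csv'),\n",
- "    os.path.join('Scenarios', today.strftime('Intex curves_%Y-%m%d'), 'sss'),\n",
- "]\n",
- "for path in outputs:\n",
- "    n = len(os.listdir(path)) if os.path.isdir(path) else 0\n",
- "    print(path, ':', n, 'files')"
- ],
- "language": "python",
- "metadata": {},
- "outputs": []
- },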
- {
- "cell_type": "heading",
- "level": 2,
- "metadata": {},
- "source": [
- "Running the scenarios in Intex"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "* First make sure that Intex look for the sss files in the right\n",
- " directory. Go into Tools, Options, Data Access and change it\n",
- " appropriately. (e.g. Z:\\CorpCDOs\\Scenarios\\Intex\n",
- " curves_2013-07-22\\sss)\n",
- "* In Intex Options, under Miscellaneous, Change the export settings to\n",
- " \"Tab delimited\"\n",
- "* Select the cusips of interest on the spreadsheet (make sure the\n",
- " corresponding sss files exist), do ``CTRL+Shift+I`` and paste them\n",
- " into a new portfolio tab in Intex... twice so that Intex actually\n",
- " paste the data.\n",
- "* Then go onto the Scenarios Tab, and click the cross next to Rate\n",
- " Update to make sure it reloads the Interest Rates (otherwise intex\n",
- " will run it with libor set at 0... not good)\n",
- "* Then go onto the Cashflows tab and wait... Once it's done, export\n",
- " the cashflows into ``Scenarios/Prices_%Y-%m%-%d``.\n",
- "* Finally, on the server do:\n",
- "\n",
- " cd /home/share/CorpCDOs/code/R\n",
- " Rscript load_cf.R\n",
- "\n",
- " This will load the cashflows and compute the deltas, and save it in\n",
- " the database.\n",
- "* Enjoy your work by typing ``CTRL+T`` on the spreadsheet and see how\n",
- " all the columns get filled. Yay!"
- ]
- },
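- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "This peek at the tab-delimited cashflow exports is purely illustrative:\n",
- "the directory pattern comes from the text above, while the ``*.txt``\n",
- "extension and the use of pandas here are assumptions."
- ]
- },
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [
- "import datetime\n",
- "import glob\n",
- "import os\n",
- "import pandas as pd\n",
- "\n",
- "# Sketch only: read one exported cashflow file and report its shape.\n",
- "# Nothing is assumed here about the column layout of the export.\n",
- "prices_dir = os.path.join('Scenarios',\n",
- "                          datetime.date.today().strftime('Prices_%Y-%m-%d'))\n",
- "for path in sorted(glob.glob(os.path.join(prices_dir, '*.txt')))[:1]:\n",
- "    cf = pd.read_csv(path, sep='\\t')\n",
- "    print(path, ':', cf.shape[0], 'rows x', cf.shape[1], 'columns')"
- ],
- "language": "python",
- "metadata": {},
- "outputs": []
- },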
- {
- "cell_type": "code",
- "collapsed": false,
- "input": [],
- "language": "python",
- "metadata": {},
- "outputs": []
- }
- ],
- "metadata": {}
- }
- ]
-} \ No newline at end of file