README for most of what is in this directory.

Main goal: compute a likelihood to separate e from pi0 using a given set of variables.

Structure: the analysis uses the DASH framework. The following subroutines set up the DASH framework:
  modDeclare.cc
  readntuple.cc
  readzbs.cc
  writentuple.cc
  writezbs.cc
It also requires c++-includes/ and
  mcguts.h
  ntread_atmpd-for.h
  ntread_atmpd.h
Everything else is my own code.

Overview:

First, I create histograms of the variables I want to use in the likelihood, for signal and background. These histograms are needed to build the lookup tables used to compute the likelihood (subroutine ftables.F and kumac t2k-tables.kumac).

To make the text files containing the tables I use:
  ring.kumac
  pi0mass.kumac
  efrac.kumac
  pid.kumac
  xalong.kumac
  cosnue.kumac
  cosopen.kumac
  deltal.kumac
  totpe.kumac
I can choose which of efrac, totpe, xalong, and cosopen to apply via variable.txt (the final set is efrac, cosopen, and xalong).

Second, I compute the likelihood itself (subroutine likelihood_7.F and kumac t2k-like.kumac).

In both steps I split the events into energy bins and apply a set of cuts (FCFV, single ring, e-like, no decay electron). I also compute the reconstructed energy (subroutine precuts.F).

When running on T2K MC I also need to normalize the nu_mu properly relative to the nu_e. Since an FCFV cut is already applied to my ntuples, I use the official ones to compute the normalization (normalize.F and t2k-like-off.kumac). The normalization also needs the nu_mu survival probability (numusurvival.F) and the nu_e appearance probability (prob.F). I also have programs to compute the nu_e signal out of the nu_e beam: sim_nuesig.F and norm-esig.kumac.
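The likelihood step above can be sketched as a binned log-likelihood ratio over the chosen variables. This is a minimal illustration only: the struct, function names, binning, and PDF floor here are my own assumptions, not the actual contents of ftables.F or likelihood_7.F.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical per-variable lookup table: binned PDF values for
// signal (e) and background (pi0), each normalized over [lo, hi).
struct LookupTable {
    double lo, hi;                 // variable range
    std::vector<double> sig, bkg;  // per-bin PDF values

    double pdf(const std::vector<double>& p, double x) const {
        if (x < lo || x >= hi) return 1e-6;   // floor for out-of-range values
        std::size_t bin =
            static_cast<std::size_t>((x - lo) / (hi - lo) * p.size());
        return std::max(p[bin], 1e-6);        // floor avoids log(0)
    }
};

// Log-likelihood ratio over the selected variables (e.g. efrac,
// cosopen, xalong): positive values favor the e hypothesis,
// negative values favor pi0.
double logLikelihoodRatio(const std::vector<LookupTable>& tables,
                          const std::vector<double>& values) {
    double llr = 0.0;
    for (std::size_t i = 0; i < tables.size(); ++i)
        llr += std::log(tables[i].pdf(tables[i].sig, values[i]) /
                        tables[i].pdf(tables[i].bkg, values[i]));
    return llr;
}
```

A cut on this ratio (per energy bin, as described above) then separates the e-like signal from the pi0 background.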
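The reconstructed energy mentioned above is typically computed under a CCQE assumption. Below is a sketch of the standard two-body formula (neglecting binding energy and the proton-neutron mass difference); whether precuts.F uses exactly this form is an assumption on my part.

```cpp
#include <cmath>

// CCQE reconstructed neutrino energy (GeV), assuming scattering on a
// neutron at rest, neglecting nuclear binding energy and the
// proton-neutron mass difference.
// Ee: lepton total energy (GeV), cosTheta: angle to the beam direction.
double recoNeutrinoEnergy(double Ee, double cosTheta) {
    const double mn = 0.93957;   // neutron mass (GeV)
    const double me = 0.000511;  // electron mass (GeV)
    double pe = std::sqrt(Ee * Ee - me * me);  // lepton momentum
    return (mn * Ee - 0.5 * me * me) / (mn - Ee + pe * cosTheta);
}
```

For forward electrons (cosTheta near 1) the reconstructed energy is close to the lepton energy; it grows as the scattering angle increases.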
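The nu_mu survival probability used in the normalization (numusurvival.F) has, in the two-flavor approximation, the standard form P = 1 - sin^2(2*theta23) * sin^2(1.267 * dm^2 * L / E). A sketch follows; the oscillation parameter values in the test are illustrative, not necessarily those used in the analysis.

```cpp
#include <cmath>

// Two-flavor nu_mu survival probability.
// dm2 in eV^2, L in km, E in GeV; 1.267 absorbs the unit conversions.
double numuSurvival(double sin2_2theta23, double dm2, double L, double E) {
    double s = std::sin(1.267 * dm2 * L / E);
    return 1.0 - sin2_2theta23 * s * s;
}
```

At the T2K baseline of 295 km the first oscillation maximum sits near the beam peak energy, which is what makes the nu_mu flux deficit (and the nu_e appearance computed by prob.F) large there.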