Tucson -- Steward Observatory
Hello all -- figured I’d drop an update while my code runs.
I’m now employed by Steward Observatory at the University of Arizona as a research specialist, and will be working with LSST Deputy Director Beth Willman and her near-field cosmology group for the next six months before I start grad school (where that will be is still TBD).
My first order of business is integrating new data from the Bok Telescope at Kitt Peak National Observatory into the analysis I previously did with Dark Energy Camera data from the CTIO Blanco 4m Telescope. I am using PyRAF (IRAF...but in Python) to script the reduction process, and will be using DAOPHOT II to do point-spread-function photometry. Unfortunately I need to write the reduction script from scratch, since I didn’t need one for the DECam data. That’s what I’m currently working on, but it’s coming along quickly and I expect to start the photometry early next week. After that I have to calibrate the photometry, weave the new data into the old data in a (hopefully) seamless manner, and look at the results.
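For anyone curious, the skeleton of such a script looks something like this -- a minimal sketch assuming standard ccdred-style calibration frames; the filenames and parameter choices here are placeholders, not the actual Bok pipeline:

from pyraf import iraf

# load the IRAF packages that provide ccdproc
iraf.noao(_doprint=0)
iraf.imred(_doprint=0)
iraf.ccdred(_doprint=0)

# bias-subtract and flat-field the science frames
# (placeholder filenames; the real script will differ)
iraf.ccdproc("object*.fits", ccdtype="object",
             fixpix="no", overscan="no", trim="no",
             zerocor="yes", zero="Zero.fits",
             flatcor="yes", flat="Flat*.fits")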
Will update again when progress has been made. 
P.S. Tucson is great. Check out my office view (the mountains look way bigger in person).
[image: the view from my office]
Last semester recap -- Done at Haverford
Hey all, back again. Last semester was my last one at Haverford College! I didn’t get much research done, but I did go to AAS 229 in Grapevine, Texas, where I gave a talk, “Spectral Classification of Heavily Reddened Stars by CO Absorption Strength,” about the research I did at Colgate University last summer. Now I am working to wrap up the project and get the paper out. I’m trying to finish by the end of the month, because on February 1 I fly out to Tucson to work with Beth Willman at the University of Arizona for the next six months, until I start graduate school (I don’t know where I’m going yet). I had about three weeks off after the end of last semester, and it was nice to finally have some free time. But it’s also good to get back into research, and it’ll be really good when this project is wrapped up. I’ll be back with another post at some point while I’m in Tucson, and I’ll make the obligatory post when I decide where I’ll be going to graduate school.
Monday June 13, 2016
Today I added a sum of two Gaussians (for blended lines) as a model function in my spectral line fitting code, as well as automated unit transformations of the wavelength axis.
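For the curious, the model function is nothing fancy -- a minimal sketch, with illustrative parameter names (and astropy-based unit handling as one way to do the conversion, not necessarily what my code does):

import numpy as np
import astropy.units as u

# sum of two Gaussians for fitting blended lines; amplitudes, centers,
# and widths are all free parameters
def double_gaussian(x, a1, mu1, sig1, a2, mu2, sig2):
    g1 = a1 * np.exp(-0.5 * ((x - mu1) / sig1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((x - mu2) / sig2) ** 2)
    return g1 + g2

# wavelength-axis unit transformation, e.g. angstroms to microns
def to_micron(wave_angstrom):
    return (wave_angstrom * u.AA).to(u.micron).value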
Friday, June 10, 2016 - Week Recap
This week my team’s advisor, Jeff Bary, was away at the Cool Stars 19 meeting in Uppsala, Sweden. In his absence, we had the freedom to explore our new project -- something I feel I took full advantage of.
One of the main goals Jeff gave us for the week was to get used to TripleSpecTool, a GUI for reducing images taken with the TripleSpec spectrograph on the ARC 3.5m telescope at APO. On Friday of last week I finished getting IDL set up on the lab computers, so we were ready to work this week.
I quickly felt comfortable with the GUI, which automatically handles many of the parts of spectrum reduction that are hardest (in my experience with IRAF). I reduced one night of prior TripleSpec data and felt comfortable enough not to spend the whole week repeating the exercise.
Instead, I transferred some of the reduced spectra to my laptop and began looking at how we should analyze them in the coming weeks. We are investigating the reddening laws in dense cloud cores by taking spectra of bright stars behind the clouds: when the light from these stars passes through a dense cloud, it is extincted and reddened. To measure that reddening, we have to know the spectral type (OBAFGKM) and luminosity class (I-V) of these background stars. Thus, spectral classification is going to be a large part of this project.
After searching the literature, I found several papers focusing on stellar classification in the near-infrared regime we will be studying (Wallace 1997; Meyer 1998; Coelho 2005; Winge 2009), and noticed that they all report either continuum-subtracted or continuum-normalized spectra. I figured a good place to start was determining the stellar continuum (the overall shape of the spectral energy distribution in the region we are studying).
I read that polynomial fitting was common, so I coded it up in Python and met with less than stellar results: several regions of the spectrum were not well fit, even by high-degree (up to 15th-order) polynomials. So I looked at other methods and turned to one of my favorites: the spline.
At heart, a spline is an interpolation technique for fitting the regions between data points. But unlike polynomial interpolation, where a single interpolating function is valid globally, a spline is a piecewise function with a different polynomial fit between each pair of its “knots.” In a pure interpolation spline, each knot is a data point.
However, there are also smoothing splines, where the number of knots is less than the number of data points. When N(knots) < N(data), the spline is rougher than it would be if every data point were a knot: thus the smoothing spline, as its name suggests, can produce a smoothed fit. I used a cubic spline, which is continuous up to its second derivative, with a constrained error of 0.8 Jansky; the algorithm adds knots until the total error of the spline drops below that constraint.
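For reference, a minimal sketch of this approach with scipy (not necessarily the exact routine I used): UnivariateSpline adds knots until the sum of squared residuals falls below the smoothing factor s, which plays the role of the error constraint described above.

from scipy.interpolate import UnivariateSpline

def fit_continuum(wave, flux, s=0.8):
    # k=3 requests a cubic spline; s is the smoothing factor
    spline = UnivariateSpline(wave, flux, k=3, s=s)
    continuum = spline(wave)
    # return the continuum-subtracted spectrum and the fit itself
    return flux - continuum, continuum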
The results of this analysis are summarized in the plot below. I chose to subtract the continuum, but another valid technique is to divide the original spectrum by the continuum, which yields a normalized spectrum (average value 1 rather than 0).
[image: continuum fit and continuum-subtracted spectrum]
I also note the wavelength ranges where other studies overlap with ours in the box to the lower left.
After completing this, I still felt frustrated about the quality of the continuum fit. It still seems to me that there is a slight residual trend in the 1.4-1.8 micron region and in the CO overtone bands, where the continuum was not fit perfectly because of the strength of the absorption features there. So I went looking for another solution.
I knew that IRAF has splot, which does things similar to what I’m trying to do, but since the lab computers were updated to OS X 10.11 the IRAF installations seem to be broken, and judging from iraf.net, many others have run into the same problem. Without admin access to fix the installation, I had to try something else.
I found a simple GUI program for spectrum normalization, written in Python, at http://python4esac.github.io/plotting/specnorm.html , and thought it was worthwhile to investigate.
The core functionality is simple but robust: the user marks continuum points across the spectrum, a cubic spline is fit to those points, and the spectrum is divided by the spline and replotted.
After looking at the code, I realized that the original author had already set up the interactive machinery for me, so adding functionality would be as easy as picking a key and writing the code behind it. So I got to it.
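The pattern is standard matplotlib event handling; a stripped-down sketch of the idea (the 'a' key binding here is hypothetical, not the actual binding in specnorm.py):

import matplotlib.pyplot as plt

fig, ax = plt.subplots()
points = []

def on_key(event):
    # each keyboard shortcut gets its own branch; here 'a' marks
    # a continuum point at the cursor position
    if event.key == 'a' and event.xdata is not None:
        points.append((event.xdata, event.ydata))
        ax.plot(event.xdata, event.ydata, 'rx')
        fig.canvas.draw()

fig.canvas.mpl_connect('key_press_event', on_key)
plt.show()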
I altered the code to read .fits files correctly, fixed a unit mismatch (angstroms vs. microns) that caused bugs with our spectra, and, most importantly, implemented equivalent width measurement.
Judging from a thread on AstroBetter in which professional astronomers express frustration with the tools currently available for analyzing spectral features, I’m not alone in wanting something better -- and none of the tools suggested there looked particularly like what I wanted. So I kept tinkering with the specnorm.py code and implemented spectral feature fitting of my own.
One of the quantities of these spectral features we are most interested in is the equivalent width (see https://en.wikipedia.org/wiki/Equivalent_width ), which requires a robust integral over the spectral feature and an estimate of the continuum around it.
I currently have equivalent width measurement implemented in two ways: after the user selects a point on either side of the feature, both a cubic spline and a Gaussian profile are fit to the data in the user-defined region. Each fit is integrated, and the equivalent widths, along with the full width at half maximum of the Gaussian, are printed to the terminal.
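For the Gaussian branch, the math reduces to a closed form: with EW defined as the integral of (1 - F/F_c) over wavelength, a Gaussian absorption profile integrates analytically. A minimal sketch with illustrative names, not the actual functions in my code:

import numpy as np
from scipy.optimize import curve_fit

# continuum minus a Gaussian dip
def gaussian_absorption(x, depth, mu, sigma, continuum):
    return continuum - depth * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def equivalent_width(wave, flux, p0):
    popt, _ = curve_fit(gaussian_absorption, wave, flux, p0=p0)
    depth, mu, sigma, continuum = popt
    # EW = integral of (1 - F/F_c) dlambda, analytic for a Gaussian
    ew = depth * sigma * np.sqrt(2.0 * np.pi) / continuum
    fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma
    return ew, fwhm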
I also experimented with MCMC in Python (the emcee package) this week. One way we might determine spectral types is by fitting spectral classification templates to our data, and the best way to do that would probably be MCMC.
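A toy sketch of how that template fitting might look with emcee -- the data and template below are synthetic placeholders, and the model is just a single scale factor, not our actual classification scheme:

import numpy as np
import emcee

# fake data: a scaled template plus Gaussian noise
wave = np.linspace(2.25, 2.35, 200)  # microns
template = 1.0 - 0.3 * np.exp(-0.5 * ((wave - 2.3) / 0.005) ** 2)
err = np.full_like(wave, 0.02)
flux = 0.9 * template + err * np.random.randn(wave.size)

def log_prob(theta):
    scale = theta[0]
    if scale <= 0:
        return -np.inf  # simple prior: positive scaling only
    return -0.5 * np.sum(((flux - scale * template) / err) ** 2)

nwalkers, ndim = 32, 1
p0 = 1.0 + 0.01 * np.random.randn(nwalkers, ndim)
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 1000)
print(np.median(sampler.get_chain(discard=200, flat=True)))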
Overall I think it was a good week. I might see about improving the Gaussian fitting in the code, but we’ll see what Jeff thinks of it next week. I might also want to look into fitting Voigt profiles -- but that will have to wait for another post.
hello again
I am back from junior year and have survived.
This summer I am working with Professor Jeff Bary at Colgate University and several other undergraduates supported by the Keck Northeast Astronomy Consortium’s REU, doing spectroscopic investigation of the reddening laws in dense molecular cloud cores.
I will start using this blog again, as a way to document my work for myself and my team, as well as for anyone else who happens to be following along.
Looking forward to a good summer ; )
End of semester update
Phew, just finished the last of my work for this semester.
Hopefully I didn’t fail music. lol
Well, I’m attending the meeting of the American Astronomical Society in Kissimmee, Florida from January 4-8. Got my train+plane tickets, hotel room, POSTER...everything is ready. Looking forward to some good downtime till then. I’ll update everyone on how it went afterwards!
Back at it (for a bit)
After taking the last week off to rest and spend time with my family, I’m going to spend some time tonight working on the analysis of our Laevens 1 data.
Laevens 1 is a newly discovered object, and there is dissension in the community about whether it is a globular cluster or a dwarf galaxy. I am searching for RR Lyrae variable stars to map the spatial extent of the object, which will hopefully tell us something about its interaction history with the Milky Way. The median period of the RR Lyrae variables can also be used to compare Laevens 1 with other similar, better-studied objects.
Last day at Texas Tech
Well, today will be my last day working at Texas Tech for the summer. I fly back to PA early tomorrow. It’s been a good summer -- I learned a lot and we’re getting close to having fantastic results. I am very thankful that I had the chance to work with a great professor and postdoc, and I feel good about having a perspective on astronomy outside of Haverford. Although it’s been great, I really am looking forward to going back home and taking a break. I’ve got some things I need to take care of before going back to school, and one of those things is definitely getting rest. I’ll continue doing some work from home, but mostly I’ll be trying to recuperate before the new school year begins. It’s been good!
LCOGT Telescope Proposal Accepted
We have been given the telescope time we requested and we will be taking our first new observations tonight. Wish us luck!
LCOGT Telescope Proposal Finalized
My advisor is happy with the telescope proposal I wrote to get more data for my current project and will be submitting it tomorrow morning. We are on the verge of getting Results for this project and we just need a little more data before we can draw real conclusions. 
Until we get new data I’ll be shifting my focus from the dwarf galaxy I was working on to a newly discovered object that has been characterized as a globular cluster by one group and a dwarf galaxy by another. There’s a big difference! I’ll be doing similar analysis to what I’m doing now, so I anticipate moving forward with the analysis fairly quickly.
Weekend update -- Python
For most of this summer I’ve been focusing on just getting work done: so I was coding in IDL, my first language and generally my preference.
However, Python has some real advantages and is more versatile, so I’ve been trying to mix learning Python in with doing science.
This weekend I made some simple programs in Python: a GUI interface to SSHFS (for mounting remote filesystems locally), a configuration file for John Thorstensen’s JSkyCalc (a wonderful tool for planning observing runs), and a startup script for IPython notebooks. I’ve also been working on converting my current IDL code to Python and have made good progress.
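The SSHFS wrapper mostly just shells out to the command-line tools; a stripped-down sketch of the idea (host and paths are placeholders, and the unmount command assumes Linux):

import subprocess

def mount(remote="user@host:/remote/data", mountpoint="/home/me/mnt"):
    # invoke the sshfs CLI to mount the remote directory locally
    subprocess.check_call(["sshfs", remote, mountpoint])

def unmount(mountpoint="/home/me/mnt"):
    # FUSE filesystems on Linux are unmounted with fusermount -u
    subprocess.check_call(["fusermount", "-u", mountpoint])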
SymPy -- a free alternative to Mathematica
Integration with IPython notebooks allows for Mathematica-style interactive computation.
Implementation as a Python module allows for differentiation, integration, and other mathematical techniques within Python programs.
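A small taste of what that looks like (a minimal sketch, nothing specific to my own code):

import sympy as sp

x = sp.symbols('x')
expr = sp.sin(x) * sp.exp(-x)
print(sp.diff(expr, x))                   # symbolic derivative
print(sp.integrate(expr, x))              # indefinite integral
print(sp.integrate(expr, (x, 0, sp.oo)))  # definite integral: 1/2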
Mendeley Desktop
Something that becomes apparent rather quickly when you begin to get serious about academic research is the desperate need to organize all the journal articles you read on a daily basis.
Things become even more complicated when you’re preparing documents in LaTeX and find yourself having to cite 30+ papers. What’s the best workflow to integrate paper organization and citation?
The best process I have found is Mendeley Desktop + BibTeX + LaTeX.
Note: this is not a tutorial on how to use BibTeX or LaTeX -- I assume you already know how they work. If people would like a series where I show my favorite TeX processing workflow, I can do that, but here I’m focusing on how I organize papers for citation.
The starting point is a Mendeley library with journal articles properly tagged, and folders holding the articles you want to cite.
[screenshot: Mendeley library with tagged articles and folders]
Then go into your Mendeley settings and have it create a BibTeX file for each folder you make.
[screenshot: Mendeley BibTeX sync settings]
Now, for my herc_paper folder, I will have a BibTeX file at /home/cgarling/Documents/Mendeley/herc_paper.bib
I can use this in my LaTeX document without moving it into my working directory -- that is, I can reference the Mendeley directory directly. This way, TeX will always use the most up-to-date version of the BibTeX file made by Mendeley. To make my bibliography, I simply have
\usepackage[round, sort]{natbib}
blah blah \cite{something}
\bibliographystyle{apj}
\bibliography{/home/cgarling/Documents/Mendeley/herc_paper}
TeX will then use my up-to-date file from Mendeley, and my bibliography will be made. Easy! 
[screenshot: the rendered bibliography]
(I am using emulateapj document class with apj bibliography style). 
How’s Texas Tech??
I’m glad you asked.
THE GOOD
The facilities here are very nice! I’m in the basement of the physics building and I have a lab to myself most of the time. The network is excellent and I haven’t had any trouble doing my work. Also, there are MANY astronomers here, all working on cool things, and it’s really nice to just chat with them about what they’re doing. We only have two astronomers at Haverford College, so it’s much different at Tech.
THE BAD
When I’m outside my sickly pale skin feels like it’s being bitten by fire ants. 
Overall a great experience but I can’t wait to get back to Pennsylvania’s weather!
So I haven’t posted anything here since last summer. oops...
My sophomore year at Haverford College went well, but in all honesty I didn’t get much research done.
But this summer I am stationed at Texas Tech, working with Dave Sand and Denija Crnojevic on campus, and with Beth Willman and Jonathan Hargis (from Haverford) remotely, to continue my work on finding RR Lyrae stars in Milky Way dwarf galaxies.
Last summer we did a great job nailing down the parameters for the photometry of our astronomical images. Jonathan then applied those parameters and performed the photometry over the school year while I was busy being a student. That put me in a great position to begin analysis this summer.
We began by checking that the photometry was of good quality -- this consisted of looking through over 2000 pages of quality-assessment plots. (...you can skim them pretty fast though. They just have to have the right general shape.)
After getting preliminary confirmation that the photometry was a success, I set out to calibrate the data. 
Photometry (in our case with Peter Stetson’s DAOPHOT and ALLSTAR suite of programs) returns instrumental magnitudes for stars, and these need to be calibrated onto a standard system. There are multiple ways to do this, but the galaxy we are studying falls in the footprint of the Sloan Digital Sky Survey (SDSS), so we simply take the stars that appear in both our images and the SDSS catalog and compare their magnitudes to derive the calibration. Mathematically this gets fairly technical; I’ve written a document on it, and if anyone is interested I’d be happy to send it out.
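To give the flavor of it, here is a toy sketch of one common form of the calibration -- a zero point plus a color term, solved by least squares. The arrays and band choices are illustrative, not necessarily our exact transformation:

import numpy as np

# m_inst: instrumental magnitudes of stars matched to SDSS;
# g_sdss, r_sdss: catalog magnitudes of the same stars
def calibrate(m_inst, g_sdss, r_sdss):
    # model: g_sdss - m_inst = zp + c * (g - r)
    color = g_sdss - r_sdss
    A = np.vstack([np.ones_like(color), color]).T
    zp, c = np.linalg.lstsq(A, g_sdss - m_inst, rcond=None)[0]
    return zp, c

# calibrated magnitudes then follow as m_cal = m_inst + zp + c * color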
So this is where I began the summer!! 