Friday, July 23, 2010

PROGRESS! Well, sort of...

Even though I still don't understand much of the papers, and will keep wading through them to learn optimal experimental design (which involves completely new ideas like Shannon information and utility functions), I did manage to make some progress on my Monte Carlo code yesterday. I read in Ford 2008 that he just used a Jeffreys prior, P(x) = 1/x, since this represents maximum ignorance about the scaling of a parameter. So I adopted a Jeffreys prior for both period and velocity semi-amplitude. I added a uniform distribution for eccentricity, though I want to figure out something better than that; I don't know whether a Jeffreys prior would be right for eccentricity, since eccentricity is not a scale parameter.
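In case it's useful later: sampling from a Jeffreys prior on a bounded range [lo, hi] is just sampling uniformly in log x. Here is a minimal sketch (the bounds, sample size, and function name are my own choices for illustration, not anything from Ford 2008):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_jeffreys(lo, hi, size, rng):
    """Draw from p(x) proportional to 1/x on [lo, hi]: uniform in log x."""
    u = rng.uniform(0.0, 1.0, size)
    return lo * (hi / lo) ** u

# Hypothetical bounds, just for illustration
periods = sample_jeffreys(1.0, 1e4, 1000, rng)  # days
K = sample_jeffreys(1.0, 1e3, 1000, rng)        # velocity semi-amplitude, m/s
ecc = rng.uniform(0.0, 1.0, 1000)               # uniform prior on eccentricity
```

The nice check is that log10 of the samples comes out flat, which is exactly the "ignorance about scale" property.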

I have finished (other than the concerns expressed above) writing a "Survey" program and everything inside it, and am now working on the actual body of the Monte Carlo code. That is the progress part of yesterday. I have multiple problems that I am trying to figure out. First: the neighbor function, i.e., how to generate each new trial point.

Time presents the worst problems. The way I have it right now, the time vector changes with the number of observations: if the number of observations goes down, dates get randomly eliminated; if it goes up, dates get randomly added. I don't like coupling the optimization of those two parameters. It also doesn't account for certain objects being unobservable for parts of the year, and it's the same for all stars. I have no safeguard against putting two identical times into the time vector, and no constraint requiring observations in each semester.
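One way the duplicate-time problem could be handled is to keep the time vector as a set while perturbing it. This is only a sketch of a neighbor move under my own assumptions (the function name, the rounding precision, and the single `window` tuple are all made up; it still doesn't handle seasonal visibility or per-semester constraints):

```python
import numpy as np

def perturb_times(times, n_target, window, rng):
    """Neighbor move for the observation-time vector.
    times: array of dates (e.g. JDs); n_target: desired number of obs;
    window: (t_min, t_max) allowed observing range."""
    pool = set(np.round(times, 3))  # rounding avoids float near-duplicates
    # Remove random dates while we are above the target count
    while len(pool) > n_target:
        pool.discard(rng.choice(sorted(pool)))
    # Add random dates until we reach the target count
    while len(pool) < n_target:
        t = round(rng.uniform(*window), 3)
        pool.add(t)  # set membership guarantees no exact duplicates
    return np.array(sorted(pool))
```

Because additions and removals both go through the set, the number of observations and the dates themselves are at least perturbed independently of each other, which might loosen the coupling a bit.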

The second problem is noise. First, it depends on what I do for time above. Second, when I add more stars, I'll have to include dimmer stars, and the average exposure time goes up for the same signal-to-noise ratio. Do I need to include that, and if so, how do I do it efficiently? Another problem: I do not know how to translate from signal-to-noise ratio to instrumental uncertainty in m/s.
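For the exposure-time part, at least, there is a standard photon-limited scaling I can lean on: S/N goes as the square root of the collected photons, and flux scales as 10^(-0.4 m), so reaching the same S/N on a fainter star costs a factor 10^(0.4 Δm) in exposure time. A one-liner (my own function name):

```python
def exposure_scale(m_new, m_ref):
    """Exposure-time ratio needed to reach the same photon-limited S/N
    on a star of magnitude m_new relative to one of magnitude m_ref:
    t_new / t_ref = 10**(0.4 * (m_new - m_ref))."""
    return 10 ** (0.4 * (m_new - m_ref))

# A star 2.5 mag fainter needs 10x the exposure for the same S/N
```

In the same photon-limited regime the velocity uncertainty should scale roughly as 1/(S/N), but the absolute normalization in m/s depends on the spectrograph and the stellar line content, which is exactly the piece I still don't know how to pin down.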

In the overall category: do I change just one parameter at a time, or all of them every iteration?

Even though I know the overall theory of the method, I don't actually know how to accomplish specific parts. For example, how should the acceptance probability change with the number of iterations? Does the acceptance probability depend linearly on the difference in planets found between two points?
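If this ends up being simulated annealing, the usual answer is that the acceptance probability is not linear in the cost difference: the Metropolis rule takes a worse state with probability exp(-ΔE/T), and T is lowered on some schedule as iterations go on. A sketch under that assumption (geometric cooling with my own T0 and alpha, which would need tuning):

```python
import numpy as np

def accept(delta_cost, temperature, rng):
    """Metropolis acceptance: always take improvements; take a worse
    state with probability exp(-delta_cost / temperature)."""
    if delta_cost <= 0:
        return True
    return rng.uniform() < np.exp(-delta_cost / temperature)

# Geometric cooling schedule, T_k = T0 * alpha**k (a common choice)
T0, alpha = 1.0, 0.95
temps = [T0 * alpha ** k for k in range(1000)]
```

With this scheme the "difference in planets" would enter through delta_cost, so the dependence is exponential rather than linear, and it sharpens automatically as the temperature drops.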

Also, do I need to be concerned about efficiency? Running one survey of 100 stars takes about a minute. If I run ten thousand steps of the MC method, that means not only running the survey at each step but also calculating each parameter to feed into it, which adds up to a very long time. I don't understand why a single survey takes an entire minute. Probably a stupid FOR loop that I can't eliminate.
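I don't know yet which loop is the culprit (and my code may not even be in Python), but the usual fix in array languages is to replace an element-by-element loop with one whole-array expression. A toy example of the pattern, using a circular-orbit radial-velocity curve I made up for illustration:

```python
import numpy as np

# Loop version: evaluate the model one epoch at a time
def rv_loop(times, K, period, phase):
    v = np.empty(len(times))
    for i, t in enumerate(times):
        v[i] = K * np.sin(2 * np.pi * t / period + phase)
    return v

# Vectorized version: identical result, one array expression, no FOR loop
def rv_vec(times, K, period, phase):
    return K * np.sin(2 * np.pi * times / period + phase)
```

The two give the same numbers; the second just hands the whole time array to the library at once instead of paying interpreter overhead per epoch.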

Though I am definitely challenged on this problem, I feel like I am learning a lot and gaining an extremely valuable skill. Back to paper reading :)

Survey Update
Two ninth-magnitude, 3-solar-mass stars in NGC2458 have been added. Five others in the same cluster were eliminated for one reason or another.
