Sunday, September 02, 2012

Feynman diagrams in LaTeX

Sometimes you need Feynman diagrams in papers. It turns out there are quite a few ways to get there, but the handiest is feynMP. I'm using MiKTeX and Texmaker in particular, but something like this should work for other setups. Here are the steps:

First, get the feynmf package from your package manager.



Next, add the following to your $\LaTeX$ preamble. I know the package you just installed is named feynmf; if you use the feynmf style itself, you'll get a lot more errors. The feynmp style is nicer, and it comes included when you install feynmf.
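A minimal sketch of what that preamble addition might look like, assuming you compile with pdflatex (the \DeclareGraphicsRule line teaches the graphics driver how to include MetaPost output such as compton.1):

    % minimal sketch, assuming pdflatex
    \usepackage{graphicx}
    \usepackage{feynmp}
    \DeclareGraphicsRule{*}{mps}{*}{}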

Now, let's say you want to add a diagram for Compton scattering (the point of the whole exercise). I suggest wrapping it in a $\LaTeX$ figure for convenience; doing so will lead to something like the snippet below. Note the name of the file is compton: after a first compilation, your $\LaTeX$ working directory will contain a file called compton.mp, and you need to run the mpost command on it, as shown after the figure code.
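Here's a minimal sketch of what that figure could look like (the vertex names v1, v2 and the canvas dimensions are just illustrative, not anything the package requires):

    \begin{figure}[htb]
      \centering
      % the fmffile name ("compton") determines the name of the generated .mp file
      \begin{fmffile}{compton}
        \begin{fmfgraph*}(150,90)
          \fmfleft{i1,i2}
          \fmfright{o1,o2}
          \fmf{fermion}{i1,v1,v2,o1} % electron line through the two vertices
          \fmf{photon}{i2,v1}        % incoming photon
          \fmf{photon}{v2,o2}        % outgoing photon
          \fmfdot{v1,v2}
        \end{fmfgraph*}
      \end{fmffile}
      \caption{Compton scattering}
    \end{figure}

And the MetaPost step, run from the same directory:

    mpost compton.mp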


This generates compton.1. Now, compile your $\LaTeX$ (are you getting tired of the cute capitalization yet?) file again, and your diagram will appear!



Note that you'll have to do this any time you edit the figure (e.g., adding labels).

But if you think this is a pain, try the alternatives!

Here's a post that set me on the track: http://physical-thought.blogspot.com/2008/08/feynmf-feynman-diagrams-in-latex.html

Here's the details on getting your preamble correct: http://tex.stackexchange.com/questions/20241/how-to-use-kile-with-feynmf-or-feynmp

Here's a quick tutorial on using feynMP: http://suppiya.files.wordpress.com/2008/02/fmfsamples.pdf

Here's a more in-depth tutorial by Thorsten Ohl, the author of feynMP/feynMF: http://xml.web.cern.ch/XML/textproc/feynmf.html

Here's a quicker tutorial by Thorsten Ohl: https://docs.google.com/viewer?url=http://www-zeus.desy.de/~kind/latex/feynmf/fmfcnl3.ps&pli=1

And here's the actual manual: http://www.pd.infn.it/TeX/doc/latex/feynmf/manual.pdf

Also, while we're on the topic, this site had a lot of invaluable tips for formatting math symbols: http://www.math.uiuc.edu/~hildebr/tex/displays.html

I hope this shortens the learning curve for someone else! If you have any feedback or corrections, send notes or pull requests to the gists.

Happy $\LaTeX$-ing!

Wednesday, January 04, 2012

General Relativity flash-cards using Anki

Anki is a neat spaced repetition system that allows you to maximize memorization efficiency, or something like that. Interestingly, I came across it on the Clojure group, and there are already decks available for learning Clojure. It accepts LaTeX, so I've decided to make a flash-deck of some handy formulas that pop up in General Relativity, because there's enough to learn without forgetting!

(Also, it's handy to have LaTeX snippets someplace semi-permanent.)
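For example, a card's fields can carry a formula wrapped in Anki's LaTeX tags ([$]...[$] renders as inline math; the front/back layout here is just illustrative):

    Front: Bianchi identity
    Back:  [$]\nabla_{[\lambda}R_{\rho\sigma]\mu\nu}=0[$]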

Bianchi identity:

$\nabla_{[\lambda}R_{\rho\sigma]\mu\nu}=0$

Christoffel symbol:

$\Gamma_{\mu\nu}^{\lambda}=\frac{1}{2}g^{\lambda\sigma}\left(\partial_{\mu}g_{\nu\sigma}+\partial_{\nu}g_{\sigma\mu}-\partial_{\sigma}g_{\mu\nu}\right)$

Covariant derivative of a 1-form:

$\nabla_{\mu}\omega_{\nu}=\partial_{\mu}\omega_{\nu}-\Gamma_{\mu\nu}^{\lambda}\omega_{\lambda}$

Covariant derivative of a vector:

$\nabla_{\mu}V^{\nu}=\partial_{\mu}V^{\nu}+\Gamma_{\mu\lambda}^{\nu}V^{\lambda}$

Covariant form of Maxwell's equations:


$\partial_{\mu}F^{\nu\mu}=J^{\nu}$

$\partial_{[\mu}F_{\nu\lambda]}=0$

for

$J^{\nu}=\left(\rho,J^{x},J^{y},J^{z}\right)$

and

$F_{\mu\nu}=\left(\begin{array}{cccc}
0 & -E_{1} & -E_{2} & -E_{3}\\
E_{1} & 0 & B_{3} & -B_{2}\\
E_{2} & -B_{3} & 0 & B_{1}\\
E_{3} & B_{2} & -B_{1} & 0
\end{array}\right)$

Riemann tensor:

$R_{\sigma\mu\nu}^{\rho}=\partial_{\mu}\Gamma_{\nu\sigma}^{\rho}-\partial_{\nu}\Gamma_{\mu\sigma}^{\rho}+\Gamma_{\mu\lambda}^{\rho}\Gamma_{\nu\sigma}^{\lambda}-\Gamma_{\nu\lambda}^{\rho}\Gamma_{\mu\sigma}^{\lambda}$

Properties of the Riemann tensor:


$R_{\rho\sigma\mu\nu}=-R_{\sigma\rho\mu\nu}$

$R_{\rho\sigma\mu\nu}=-R_{\rho\sigma\nu\mu}$

$R_{\rho\sigma\mu\nu}=R_{\mu\nu\rho\sigma}$

$R_{\rho[\sigma\mu\nu]}=0$


Ricci tensor:

$R_{\mu\nu}=R_{\mu\lambda\nu}^{\lambda}$

Ricci scalar:

$R=R_{\mu}^{\mu}=g^{\mu\nu}R_{\mu\nu}$

Einstein tensor:

$G_{\mu\nu}=R_{\mu\nu}-\frac{1}{2}Rg_{\mu\nu}$

Formulae from Sean Carroll's Spacetime and Geometry: An Introduction to General Relativity

Anki synchronizes with Dropbox, but it's a bit involved. When I get my deck synchronized and uploaded, I will post a link to it.

Update: https://ankiweb.net/shared/info/1777635479

Thursday, December 01, 2011

Lisp Conversion

A few months and a lot of Lisp later, I find myself convinced/converted ...

... To Clojure.

Rajesh, you were right!

As far as language snobbery coolness goes, it has a bunch of features I like.
To get an idea of what I mean, here's an anonymous function to find the odd numbers in a (lazy) sequence (which could be a list, vector, or hash map):
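A minimal sketch of the idea, using the #() reader shorthand for an anonymous function (the original gist may have differed):

    ;; keep only the odd numbers, using an anonymous function as the predicate
    (filter #(odd? %) (range 10))
    ;; => (1 3 5 7 9)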


This idea of lazy sequences is powerful, because you can do things like get the 10,001st prime number without blowing the stack:
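Here's one way to sketch that (nothing here is from the original gist beyond the idea): an infinite lazy sequence of primes built by trial division, from which we simply ask for the 10,001st element.

    ;; an infinite, lazy sequence of primes via trial division
    (def primes
      (filter (fn [n]
                (not-any? #(zero? (mod n %))
                          (take-while #(<= (* % %) n) (range 2 n))))
              (iterate inc 2)))

    ;; the 10,001st prime (nth is zero-based); the lazy seq is walked
    ;; iteratively, so the stack stays shallow
    (nth primes 10000)
    ;; => 104743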

You can just see the number-crunchy goodness, mixed in with Lispy functional precision.

As far as practicality goes, there is simply too much awesome stuff.
You can easily do TDD (test driven development), which is really handy if, say, you've got a bunch of math functions that you want to be sure are correct when you port/rewrite code.

Here's a screenshot of IntelliJ with a typical Leiningen project open:



You can see the typical Leiningen project layout, with /src and /test folders and subfolders. First, we'll write a test for a function we want, one which sums over all values in a given sequence:
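A sketch of what that test file could contain, reconstructed from the description below (the function name sum and the exact assertions are my guesses):

    ;; test/Newton/test/core.clj
    (ns Newton.test.core
      (:use [Newton.utilities])
      (:use [clojure.test]))

    ;; the ^{:utilities true} metadata marks this as a utilities test
    (deftest ^{:utilities true} sum-test
      (is (= 15 (sum '(1 2 3 4 5))))   ; sums over a list
      (is (= 15 (sum [1 2 3 4 5]))))   ; sums over a vector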


The test is in C:\Projects\CDT\Newton\test\Newton.test\core.clj, and the :use [Newton.utilities] tells it to look in the file C:\Projects\CDT\Newton\src\Newton\utilities.clj for our function. Note the use of metadata ^{:utilities true} to mark this as a utilities test, which we'll use later for organization. Our test checks that our to-be-defined sum function sums correctly over both a list and a vector.

Now here's the contents of C:\Projects\CDT\Newton\src\Newton\utilities.clj:
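Again a sketch, assuming the function is called sum as above:

    ;; src/Newton/utilities.clj
    (ns Newton.utilities)

    (defn sum
      "Sums all values in a sequence."
      [coll]
      (reduce + coll))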


Finally, Leiningen allows us to choose test selectors so that we can specify which tests we want to run via project.clj:
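Roughly like this (the version string and Clojure dependency are placeholders; :test-selectors maps a keyword to a predicate that is run against each test's metadata):

    ;; project.clj
    (defproject Newton "0.1.0-SNAPSHOT"
      :dependencies [[org.clojure/clojure "1.3.0"]]
      ;; the default selector skips utilities tests; :utilities selects them
      :test-selectors {:default (complement :utilities)
                       :utilities :utilities})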


Now by running lein at a command prompt (to save startup time) we can pick our tests:
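The two invocations look like this (first with no selector, then with the :utilities selector; the exact prompt depends on how you have lein running):

    test!
    test! :utilities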



Note in the first case, we don't expect any tests to run (test! means fetch dependencies and then run tests) because our sole test has been marked as :utilities. In the second case, we tell it to run the :utilities tests and it does, telling us that our test passed. Success!

If our test had failed, Clojure's test suite would give us good information. Here, I'm going to modify the second assertion to fail. Watch what happens:
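A clojure.test failure report looks roughly like this (the file, line number, and values here are illustrative); it prints the expected form alongside the actual result:

    FAIL in (sum-test) (core.clj:8)
    expected: (= 16 (sum [1 2 3 4 5]))
      actual: (not (= 16 15))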


How cool is that?

I've so far read Clojure in Action and The Joy of Clojure (both highly recommended), plus enough daily doses to actually stop mucking about and start with the CDT code already.

So, a modern Lisp with powerful IDEs, modern libraries from the JVM, interactive REPL/TDD, great documentation, learning resources, and books -- what's not to like?

Tuesday, April 19, 2011

Reflection tools for F#

I went to the fabulous CodeConf 2011 (view slides, recaps here, here, and here) and the first talk was "Tinker Fairy" Dr. Nic telling us to build tools to do stuff that we don't want to remember later. Then build tools to build those tools -- tool tools.

One of the neat modern takes on Lisp s-expressions in virtual machines like the CLR is Reflection. At least, I think it will be useful in reversing Lisp macros and expressions into their F#/OCaml equivalents.

Dr. Jon Harrop gives a terse but informative example in his book Visual F# 2010 for Technical Computing.

First, we want a union type which represents (i.e. abstracts away) the F# type system:
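Something along these lines (the constructor names are mine, not necessarily Harrop's; the type parameter stands in for type variables):

    // a union type abstracting the F# type system
    type 'a ty =
      | Unit
      | Bool
      | Int
      | Float
      | String
      | Var of 'a                  // a type variable / generic parameter
      | ListOf of 'a ty            // an F# list of some element type
      | Tuple of 'a ty list        // a tuple of component types
      | Function of 'a ty * 'a ty  // domain -> range
      | Other of System.Type       // anything not modeled here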



Next, we want a (recursive) function (called, straightforwardly enough, type_of) that reflects (using FSharpType) and translates a given System.Type object into one of the 'a ty union types defined previously:
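A sketch of such a function using FSharpType from Microsoft.FSharp.Reflection (only a handful of cases are handled; Harrop's version is more thorough):

    open Microsoft.FSharp.Reflection

    // translate a System.Type into the 'a ty representation above
    let rec type_of (t: System.Type) : string ty =
      if FSharpType.IsFunction t then
        let dom, rng = FSharpType.GetFunctionElements t
        Function(type_of dom, type_of rng)
      elif FSharpType.IsTuple t then
        Tuple [ for e in FSharpType.GetTupleElements t -> type_of e ]
      elif t.IsGenericType && t.GetGenericTypeDefinition() = typedefof<list<_>> then
        ListOf(type_of (t.GetGenericArguments().[0]))
      elif t.IsGenericParameter then Var t.Name
      elif t = typeof<unit>   then Unit
      elif t = typeof<bool>   then Bool
      elif t = typeof<int>    then Int
      elif t = typeof<float>  then Float
      elif t = typeof<string> then String
      else Other t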



This then allows us to emit the following two liner which can parse objects such as the List.fold function! (Note: everything after the ;; is the F# Interactive response.)
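A hypothetical version of that two-liner (I've pinned List.fold's type parameters with an annotation rather than guess how the book's version handles them, and omitted the fsi response):

    // in F# Interactive; everything after the ;; would be fsi's response
    let ty_of (x: 'a) : string ty = type_of typeof<'a>;;
    ty_of (List.fold : (int -> int -> int) -> int -> int list -> int);;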



Neat stuff! I've a thousand or two lines of Lisp to look at, so this is not something I want to have to remember later.

Tuesday, March 01, 2011

Software Archaeology

Vernor Vinge prophetically wrote of a time when programmer-archaeologists maintained the fabric of civilization by diving into and modifying legacy code which ran the systems that society depended upon.

Various other folks have picked up on this notion, from the serious to the humorous. Here, though, I'll talk about this from my own perspective (which is what you came here for, right?).

Kernighan's saw goes that debugging code is twice as hard as writing it; therefore we ought to keep our meaning clear and our code as simple as possible. How to do so?

There are clear debates about that: functional vs. declarative, procedural vs. object-oriented, not to mention Patterns & Anti-Patterns, Dependency Injection/Loose Coupling, Aspect-Oriented Programming, etc. These can be very fun to get into, and there are diverse and subtle points all around that I won't attempt to do justice here; but if you've a free week or two, read any of the above links and the next five references thereafter, and you'll come away more enlightened, or more confused.

But in the meantime, you've either got to a) emit working code or b) manage those who do a). And if you can do so without too badly embarrassing yourself in the future (which is nigh impossible), or at least be willing to chalk your mistakes up as learning experiences, you're well on your way to some sort of nirvana of ineffable, crystallized logic that perfectly solves your problem.

(Getting a clear problem statement is itself at least half of the battle and most of the difficulty, given business processes that aren't well understood, or that mutate depending upon who's doing them or in which context. But that discussion more properly belongs in the realm of project management and business analysis, and won't be further remarked upon here.)

If you're not a coder yourself (or horribly out of date), you can still make a fair crack at judging the product by the team. The Mythical Man-Month is the canonical reference, but Joel Spolsky's Joel Test is pretty concise, descriptive, and useful.

Archaeology can imply adventurous, sunburned types digging around fossil layers high in vast dusty mesas of stratified rock. And truth be told, that's not a bad analogy for the cacophony of systems that the average IT organization has inherited, cobbled together, purchased (often from a now-defunct vendor), or perhaps in a fit of creativity -- produced. After all, post dot-com, Greenfield development is rare.

But Brownfield development is often so painful that most developers will throw up their hands and rewrite from scratch, rather than attempting to piece together the workings of an often poorly documented system written with "ancient" methods/languages.

Hence, onto the first item on the Joel Test: source control.

But not just any source control. GitHub.

Why GitHub? Well, first, it has the elusive "Alpha Coder mindshare". While it may not matter one way or another to your business that the Linux kernel, Git itself, jQuery, Ruby on Rails, and a host of other important projects exist on GitHub, it matters to your programmers, whether they know it or not (and the good ones will know it).

All of these actively maintained open source projects provide something more interesting than mindshare: examples. Pick a programming language, and you will very likely find an interesting project or two on GitHub that has something worth learning. It may even prove to be the Rosetta stone of programming languages -- you may find solutions to the same problem in many different programming languages.

Second, Social Coding. Everyone knows of the usefulness of social networks -- they existed before, but it's the tools that made them marketable/actionable. Social coding in GitHub takes the usual forms -- followers, blogs, wikis, issues, teams, organizations -- plus some more useful ones (e.g. the GitHub API).

We used Team Foundation Server. It was a nice tool in our .NET development shop -- a bit painful to set up with its dependence on SharePoint, but useful. However, it didn't scale too well in terms of collaborators. We needed to add them as users into Active Directory, fuss about with SharePoint and licensing, and so forth.

So next we tried CodePlex. CodePlex was, essentially, TFS in the cloud, and it mostly worked. There were capacity issues, and it wasn't always friendly with non-.NET languages, but the main reasons we didn't adopt it wholesale were:
  1. No way to make private repositories
  2. Often painful to connect into
  3. Went down/was slow often enough that we didn't want to rely on it.
This really illustrates the third virtue of GitHub: it's a true cloud service. Cloud computing is all the hype right now, so I want to call out its particular benefits in this instance.

In going with GitHub, we created an organization for our, well, organization. This gives us several important advantages over CodePlex:
  1. Private repositories
  2. Teams
  3. Unlimited collaborators (in particular, we can mix and match between general GitHub accounts and team members)
  4. Blogs, Wikis, Gists, Issue Trackers with voting, per-line file commenting, and other social features
  5. Works well with any programming language
  6. Fast, decentralized development (Git works locally, so you can get on a plane, code, and upload your changes once you've got internet access)
  7. Reliable versioning (Git uses hashes for files)
  8. Works well with any OS/IDE (Git has integration with Visual Studio, Eclipse, XCode plus command-line versions in most every OS)
  9. Git is a well-regarded distributed version control system (DVCS)
My programming team ported projects over from TFS and CodePlex in under a day. By following projects, I can watch check-ins, view version differences, open/close issues, and do all the usual software management stuff without getting in the way. (Or better yet, delegate.)

The fees are pretty nominal (organizations get charged based on the number of private repositories they want; public ones are free). GitHub is hosted by Rackspace, so the reliability has been better than our in-house TFS boxes. Today I just added someone outside our organization to one of our projects with minimal hassle.

If you're going to be digging up fossilized code, Git and GitHub are fairly pleasant tools for the job.

Look at the time! This isn't really everything I wanted to say, but I've probably said enough for now (and I have other pressing priorities including my own research), so I'll leave further pontificating for another time.

I hope this was informative, or at least, entertaining!

(You can find me on GitHub here!)

Friday, December 03, 2010

Socialism, Capitalism, Spending, oh my!

For a change of pace, I'm going to ruminate on the state of affairs as I see them.

(It's my blog, I get to exercise my First Amendment! ;-)

(Yes, I'm procrastinating. The cluster is down for maintenance, and I need a break from technical reading and writing.)

My Pop shared an interesting article during Thanksgiving, which noted that way back in the beginnings of what would become the U.S.A., colonists in the 1600s tried a form of socialism wherein the results of the season's harvest were shared equally amongst the colonists.

(The account was written by the governor at the time; direct citations, anyone?)

The official noted that there were many deaths due to starvation, much inequity in terms of labor given vs. food received, and so forth. On the whole, the colony was in danger of perishing as the vicious cycle of famine made the remaining colonists weaker and less able to work, thus reducing the harvest, and so forth.

In these dire straits, the next thing they tried was dividing the land up equally amongst the families instead. Each family was free to keep the results of their work, and if they had excess, sell it for profit.

The results were immediate: the next season's harvest was so bountiful that many families had excess, and those families that did not were able to buy from those that did. No one starved.

The official also notes the changes in motivation (paraphrasing): "Women who claimed infirmity and poor health when compelled to serve the community went willingly into their own fields with their children. The high (and unacceptable) costs that might have necessary to compel this behavior were no longer required ..."

Score one for capitalism, individual thrift and perseverance, and ....

But wait. Where did they get the land?

Oh, that's right. They were given it. Or, from another perspective, perhaps they stole it. (I don't want to get into those issues since there's even more controversy about that and it's incidental to my point.)

The point is: the colonists were given the tools to feed themselves, and they then were allowed to make their own way.

Capitalism is a fine system for efficient distribution of goods, services, and products. But it was an act of socialism (specifically, the land grant) that gave those first colonists the means to start on their new lives. (They had to supply the effort.)

We're no longer an agrarian society. We're an information one.

What are the tools needed to make our way today?

I'd argue for these, in descending order:

  1. Health
  2. Free flow of people, goods, services, and information
  3. Education
  4. Stability
I'd also argue that lack of any one of these items is problematic.

That last item may be even more difficult to quantify, except that you know it when you see it: wars, famines, natural disasters, stock market crashes, etc. Things too big for any individual or family to handle alone, something that requires a societal solution.

The world is chaotic. Trying to impose too much stability results in a dead/fragile/stratified society (see history for numerous examples). Too little, and no one can plan for the future.

We are a mixed socialist/capitalist society. Go towards any extreme for any of the above, and we will suffer.

Define suffering: again, you know it when you see it. Death, disease, famine, wasted lives, inability to meaningfully affect your own destiny, loss of freedoms, etc. These are broad brush strokes; individuals and societies will naturally have their own values.

So how do we provide the above to our citizenry in our society? How do we give the tools to be successful, without redistributing the results of that success unfairly?

(This is not meant as an exhaustive analysis, but a mere framing of the problem.)

Government spending:

Now let me say from the outset that, like general relativity, I prefer solutions to be as localized as possible. I'd also like to remain as free as possible from any equations of constraint enforced by some larger entity. I don't dispute that some are necessary (anarchy is cool until you live in it), but I wonder if we are half as good at removing laws as we are at making them.

These entities need money to do their work. Right away that suggests a neat solution: don't give them any. Unfortunately, that only works if you already have the means to provide the 4 items mentioned above. Clearly, not all individuals do, so a society based strictly upon "to each their own" would be manifestly unfair, and I do retain the silly idea that the universe ought to be as fair as possible.

(Oh wait, we're already a plutocracy.)

Now spending itself is a moderately abstract formulation of reality: there are only so many goods and services, and only so much information and time, available.

Spend less than you make, and you have savings: for a rainy day, or to help someone else start something they couldn't otherwise do (perhaps with the hopes that you'll be compensated in the future for lending your resources today).

Spend more than you make (if you're allowed to), and you have debt, or future restrictions on your earning potential.

Pretty straightforward stuff.

It seems that Americans in general are now used to the consumption economy fueled by credit and debt, and we've passed that along to our government. Or perhaps it's the other way around.

That's a fairly common rant, and I'll not repeat it except to say: we've obviously hit the limit on how much debt (monetary, environmental, etc.) we've incurred, and we should be looking to pay it off instead of increase it.

This seems pretty simple to say and do, but let's watch how events unfold and see if the politicians actually realize that our government cannot continue to live beyond its means. At least the Fiscal Commission is a start, although of course there are economists who think it's not a problem after all.

I'm not an economist, but there seem to be some pretty instructive examples that indicate otherwise.

Okay, back to considering things I have somewhat of a clue about.




Tuesday, January 19, 2010

CDT rewrite toolbox

So, my colleagues have developed a CDT program that's usable. Fortunately for me, it's in LISP, which lacks parallel processing, modern libraries, a nice IDE, and the other goodies I've become accustomed to in my work life. (That means I get to figure these features out and thereby contribute!)

Enter Visual Studio 2008, IronPython, and IronScheme.

Setting up IronScheme with Visual Studio 2008 was usefully detailed here: (note, you need RegPkg via the Visual Studio 2008 SDK)

Setting up IronPython with Visual Studio 2008 via IronPython Studio (integrated setup):

And voila, no more excuses to complain about development.

(Yes, the end goal is to make it Python and cross-platform, although I'm really eyeing F#.)