For instance, say I have a 9-volt square wave created by a 555 timer at a frequency of 1 kilohertz, and that is the input into a transformer designed to step the voltage up to 120 volts. Will the frequency be changed in any way? Are there any changes besides voltage?

There will also be some distortion of that square wave, since (of course) the transformer is an impedance. (The frequency won't change; the higher harmonics will.)

A typical 9 V battery is anecdotally reported to be able to put out 3.5 A -- VERY briefly -- if short-circuited. That means the absolute maximum current you'd be able to get out of this setup, if we ignore conversion losses, would be 3.5 × 9/120 -- call it 0.25 A -- again, VERY briefly, before the battery starts dying and the voltage drops off. At lower draws, use the fact that the battery's probably good for about 550 mAh at 9 V -- sloppily scaling again, that means 41.25 mAh at 120 V, which means you could draw 0.04125 A for one hour; draw more and the time goes down proportionally (until you start overheating the battery), draw less and it'd last longer.
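The sloppy scaling above is just power conservation through an ideal transformer, and can be sketched in a few lines (the 3.5 A short-circuit figure and 550 mAh capacity are the anecdotal numbers from this post, not datasheet values):

```python
# Rough scaling of a 9 V battery through an ideal 9:120 step-up transformer.
# Input figures are the anecdotal numbers from the post, not datasheet values.
V_IN, V_OUT = 9.0, 120.0
I_SHORT = 3.5          # amps, very brief short-circuit current
CAPACITY_MAH = 550.0   # rough 9 V battery capacity

# An ideal transformer conserves power: V_in * I_in = V_out * I_out,
# so current (and amp-hour capacity) scale by the inverse of the voltage ratio.
i_out_max = I_SHORT * V_IN / V_OUT            # ~0.26 A, very briefly
capacity_out_mah = CAPACITY_MAH * V_IN / V_OUT  # ~41.25 mAh at 120 V

print(f"max output current: {i_out_max:.4f} A")
print(f"capacity at 120 V:  {capacity_out_mah:.2f} mAh")
print(f"sustained draw for one hour: {capacity_out_mah / 1000:.5f} A")
```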

Note that I'm also ignoring RMS issues for AC. Technically we should probably reduce all those current numbers by around 30%. That is still extremely sloppy math, and I doubt you could get close to even that reduced number, but I'm just trying to give a flavor of what this setup could and couldn't -- mostly couldn't -- do.
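For what it's worth, that "around 30%" is the sine-wave RMS factor of 1/√2 ≈ 0.707 (a true square wave's RMS equals its peak, but once the transformer rounds the output toward a sine, the sine factor is the relevant one). A quick check, applying it to the rough 0.26 A peak estimate from above:

```python
import math

# RMS of a sine wave is peak / sqrt(2); this is the ~30% derating mentioned.
rms_factor = 1 / math.sqrt(2)   # ~0.707

i_peak = 0.2625  # rough peak-current estimate from the earlier scaling
i_rms = i_peak * rms_factor

print(f"RMS factor:      {rms_factor:.3f}")
print(f"derated current: {i_rms:.3f} A")
```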

Since there aren't many 120 V devices that will run usefully on a quarter amp -- never mind four hundredths of an amp -- that wouldn't be better powered by converting the 9 V directly to the voltage they actually want, this doesn't seem a particularly useful idea.

(OK, folks -- it's been a while since I've tried to approximate power conversion efficiencies -- feel free to throw rocks at my numbers and show how I shoulda computed it. I think the point remains that the pocket-sized 120 V source would be pretty close to useless.)

As for the usefulness of the power source: I wasn't actually thinking of a 9-volt battery, but I see why you thought that was what I was thinking of. The 9 volts was just an example. The project I had in mind was going to use much more power than that, in both current and voltage, as it involves plasma generation. I am still unsure of the power needed, just sure I need a high frequency.

The longer explanation involves the Fourier expansion of a square wave as an infinite series of summed sine waves. As the highest frequencies are filtered out (the transformer unavoidably acts as an R/L low-pass filter), the accuracy with which the output mimics the input decreases (and power fails to get through the transformer). If you lose all the higher harmonics, your square wave eventually becomes a sine wave (when only the fundamental frequency is still present) and then, as the cutoff moves past that, goes away entirely.
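You can see this degradation numerically: a unit square wave is the sum of odd sine harmonics with amplitude (4/π)/k, so truncating the series (which is roughly what a low-pass filter does) smooths the wave toward a single sine. A minimal sketch:

```python
import math

def square_partial(t, f0, n_harmonics):
    """Partial Fourier sum of a unit square wave at fundamental f0.
    Only odd harmonics k = 1, 3, 5, ... appear, each with amplitude (4/pi)/k.
    Truncating the sum crudely models low-pass filtering of the harmonics."""
    total = 0.0
    k = 1
    for _ in range(n_harmonics):
        total += (4 / math.pi) * math.sin(2 * math.pi * k * f0 * t) / k
        k += 2
    return total

f0 = 1000.0        # 1 kHz fundamental, as in the original question
t = 1 / (4 * f0)   # quarter period, where the square wave sits at +1

# Many harmonics: the sum is close to the square wave's +1 level.
# Only the fundamental left: just a sine of amplitude 4/pi (~1.273).
print(square_partial(t, f0, 500))
print(square_partial(t, f0, 1))
```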

I appreciate the Fourier explanation. It's usually more important to know why something is true than to just get a simplified answer. Again, much appreciated.