Posts Tagged ‘math’

July 12, 2012

1. Use mathematics as a shorthand language rather than as an engine of inquiry
2. Keep to them [your models/problems] till you have them done
3. Translate into English
4. Illustrate with examples important to real life
5. Burn the mathematics
6. If you can’t succeed in 4, burn 3

Alfred Marshall’s rules for using mathematics in economics.
(via fantasticphenomenal)

Gender Studies & Mathematics, #2

June 29, 2012

I’m totally convinced that ∃ more connections between mathematics and the humanities than the university culture I once stewed in would suggest.

Probably due to personality differences, but also to lack of familiarity with each other’s subject matter, I never saw inter-departmental collaborations—and, as I’ll discuss in another post, even the idea of data is treated as a four-letter word in the gender studies department. (Likewise, ethnography and anecdote are four-letter words within economics, and statisticians concern themselves only with structured data.)

Nevertheless I see mathematical shapes all over cultural analysis, and I mean to record them. (However typing up a coherent few paragraphs, let alone adding drawings, takes several orders of magnitude more time than simply thinking a thought.)


After reading Sierra’s essay crowing that millennials do not see themselves as special, I went on to read more of her blog, Phoenix and the Olive Branch, which talks about rehabilitation from a “Quiverfull” fundamentalist upbringing—particularly the gender issues that arose for a Quiverfull young woman.

(Relevant to the “value of liberal arts” question, Sierra writes that “College literally saved my life”: without the critical-thinking skills, not science or programming skills, that she learned at college, her mind and heart and … uterus would have remained ensnared in the “Quiverfull” fundamentalist mindset she grew up in. Just an interesting sidelight.)


Sierra has a very logical way of describing a flaw with sexist views:

Check out this gem from “Reclaiming the Mind”:

You see, when people are truly committed and consistent egalitarians, they have to defend their denial of essential differences. In doing so, they will advocate a education system in the home, church, and society which neutralizes any assumption of differences between the sexes. In doing so, men will not be trained to be “men” since there is really no such thing. Women will not be encouraged to be “women” since there is no such thing. The assumption of differences becomes a way to oppress society and marginalize, in their estimation, one sex for the benefit of the other. Once we neutralize these differences, we will have neutered society and the family due to a denial of God’s design in favor of some misguided attempt to promote a form of equality that is neither possible nor beneficial to either sex.

As a truly committed and consistent egalitarian, yes, yes I do deny “essential” differences. You know why? My essential nature is not “woman.” My essential nature is me. Sierra. It’s who I am. …[M]y best friend[’s] essential nature [is] not identical to mine. It might have similar colors and shapes, but so would mine and my fiance’s. Because people are different. “Men” are not more different from women than they are from other men.

In statistical or mathematical language, I would interpret this as saying “The fact that gender==Woman is not entirely determinate of everything about me.”

If I were writing a computer program to mimic the kind of sexism Sierra is talking about, it would take one input for gender and, if the answer is male, then prompt for further details on the personality, achievements, background, interests, thoughts. Elsif gender == female, then the only questions worth asking are “Fat? Hot?” Otherwise, break; because there is no else.

Not that the “Being a minority is determinate of everything and only males can show variation” attitude is limited to gender. On Reddit we find:

“I can’t imagine a black guy saying ‘anywho’”

as if blackness is somehow so determinate of behaviour. Charmed, I’m sure.


In statistics the paradigm is that data go into a model and a couple numbers come out. Some of the numbers parameterise the model. But other numbers tell us how good the explanation is. There are numbers to tell us how well individual parts fit, how well the overall whole fits, and several numbers that are warning indicators for various types of traps that can make the other numbers mess up.

Thinking that everything about a minority is determined by their minority status is a bit like ignoring all the model-fit numbers.

If we explored some data with a large number of linear models, progressing from coarse (few terms) to fine (many terms), we would probably see gender differences as a significant term among coarse models. But those models would also have a low specificity and explanatory power. Then as we added more explanatory terms (finer models), those other explanators—correlates of gender/race, but not gender/race itself—would start to steal explanatory power away from the gender dummy variable.

To give a physical example, 100m sprint times show differences across male/female, but training is more determinate of the sprint time. If we could measure personality and thoughts and the kinds of traits that Sierra might say define her as a person, we would probably be left with very little t-value on the gender dummy.
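This coarse-to-fine effect is easy to see in a toy simulation — all numbers here are invented to illustrate the statistical point, not real sprint data. The outcome depends only on the mediator (“training”), training correlates with the dummy, and the dummy’s t-value evaporates once the mediator enters the regression:

```r
set.seed(1)
n        <- 1000
gender   <- rbinom(n, 1, 0.5)                 # a 0/1 dummy variable
training <- rnorm(n, mean = 10 + 4 * gender)  # training correlates with the dummy
time     <- 14 - 0.3 * training + rnorm(n)    # sprint time depends on training only

coarse <- lm(time ~ gender)             # coarse model: dummy looks hugely significant
fine   <- lm(time ~ gender + training)  # finer model: training steals its t-value

coef(summary(coarse))["gender", "t value"]
coef(summary(fine))["gender", "t value"]
```

The coarse model isn’t “wrong”—gender really does predict time here—but all of its predictive power is borrowed from the correlate.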


One more mathematical parallel. The idea that “minorities show no variation; only the privileged group can be variable” is isomorphic to Jim Townsend’s mathematical-psychology model of racism. Substitute “minority” with “other group” and “privileged group” with “self” or “my group” and you have the same model of a negatively curved metric space:

negatively curved metric space for self versus other (privileged group versus minority group)

So you thought postmodernism was opposite to science? Here is Derrida’s “privileged hierarchy” where “one term dominates the other” — at least one mathematical interpretation of those words.

June 29, 2012

multiplicitiesoffreedom demonstrates Chaos Theory in Excel. If he filled in more initial values, you would see a thick bar—like a picture of white-noise.

a chaotic process (logistic map) generated & drawn in R
white (Gaussian) noise

  • Butterflies flapping their wings in Vermont to change the wind in Hangzhou?
  • A drop of water on Jeff Goldblum’s hand taking a very different path down depending on random parameters?
  • Or—as in multiplicitiesoffreedom’s picture—like a hashing function, the codomain being a highly-discrepant reordering|shuffle of the domain?

I found a paper on Chaos Theory as a metaphor for Institutional Economics and I just couldn’t help but play around with the equations inside. (Like the methodology of inst. econ)

For those who want to play around with the logistic map in R as well as Excel, do:

library(fNonlinear)   # Rmetrics package that provides logisticSim()
y <- logisticSim()
plot( y, col=rgb(.1,.1,.1,.75) )
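If you’d rather not install a package, the logistic map is only a few lines of base R anyway (the function name here is my own; r = 4 puts it in the chaotic regime):

```r
# iterate the logistic map x[t+1] = r * x[t] * (1 - x[t])
logistic.sim <- function(r = 4, x0 = runif(1), n = 500) {
  x    <- numeric(n)
  x[1] <- x0
  for (t in 1:(n - 1)) x[t + 1] <- r * x[t] * (1 - x[t])
  x
}

y <- logistic.sim(r = 4, x0 = .2)
plot(y, col = rgb(.1, .1, .1, .75))   # looks like a thick bar of white noise
```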

How do I Create the Identity Matrix in R? Also a bit of group theory.

June 27, 2012

I googled for this once upon a time and nothing came up. Hopefully this saves someone ten minutes of digging about in the documentation.

You make identity matrices with the function diag, passing the number of dimensions in parentheses.

> diag(3)
     [,1] [,2] [,3]
[1,]    1    0    0
[2,]    0    1    0
[3,]    0    0    1

That’s it.

> diag(11)
      [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11]
 [1,]    1    0    0    0    0    0    0    0    0     0     0
 [2,]    0    1    0    0    0    0    0    0    0     0     0
 [3,]    0    0    1    0    0    0    0    0    0     0     0
 [4,]    0    0    0    1    0    0    0    0    0     0     0
 [5,]    0    0    0    0    1    0    0    0    0     0     0
 [6,]    0    0    0    0    0    1    0    0    0     0     0
 [7,]    0    0    0    0    0    0    1    0    0     0     0
 [8,]    0    0    0    0    0    0    0    1    0     0     0
 [9,]    0    0    0    0    0    0    0    0    1     0     0
[10,]    0    0    0    0    0    0    0    0    0     1     0
[11,]    0    0    0    0    0    0    0    0    0     0     1

But while I have your attention, let’s do a couple mathematically interesting things with identity matrices.

First of all you may have heard of Tikhonov regularisation, or ridge regression. That’s a form of penalty to rule out overly complex statistical models. @benoithamelin explains on @johndcook’s blog that

  • Tikhonov regularisation is also a way of puffing air on a singular matrix (det M = 0) so as to make the matrix invertible without altering the eigenvalues too much.
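To see the “puffing air” concretely, here is a made-up 2×2 example: a rank-1 matrix is singular, but adding a small multiple of the identity makes it invertible while nudging the eigenvalues by exactly λ (since the matrix is symmetric):

```r
M <- matrix(c(1, 2,
              2, 4), nrow = 2, byrow = TRUE)  # rank 1, so det(M) = 0: singular
lambda  <- 0.01
M.ridge <- M + lambda * diag(2)   # Tikhonov: puff a little air onto the diagonal

solve(M.ridge)          # now invertible (solve(M) would throw an error)
eigen(M)$values         # 5 and 0, up to rounding
eigen(M.ridge)$values   # shifted by exactly lambda: 5.01 and 0.01
```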

Now how about a connection to group theory?

First take a 7-dimensional identity matrix, then rotate one of the rows off the top to the bottom row.

> diag(7)[ c(2:7,1), ]
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    0    1    0    0    0    0    0
[2,]    0    0    1    0    0    0    0
[3,]    0    0    0    1    0    0    0
[4,]    0    0    0    0    1    0    0
[5,]    0    0    0    0    0    1    0
[6,]    0    0    0    0    0    0    1
[7,]    1    0    0    0    0    0    0

Inside the brackets it’s [row, column]. So the concatenated indices c(2,3,4,5,6,7,1) become the new row order.


Let’s call this matrix M.7 (a valid name in R: assign it with M.7 <- diag(7)[ c(2:7,1), ]) and look at the multiples of it. Matrix multiplication in R is the %*% symbol, not the * symbol. (* does entry-by-entry multiplication, which is good for convolution but not for this.)

Look what happens when you multiply M.7 by itself: it starts to cascade.

> M.7   %*%   M.7
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    0    0    1    0    0    0    0
[2,]    0    0    0    1    0    0    0
[3,]    0    0    0    0    1    0    0
[4,]    0    0    0    0    0    1    0
[5,]    0    0    0    0    0    0    1
[6,]    1    0    0    0    0    0    0
[7,]    0    1    0    0    0    0    0

> M.7 %*% M.7 %*% M.7
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    0    0    0    1    0    0    0
[2,]    0    0    0    0    1    0    0
[3,]    0    0    0    0    0    1    0
[4,]    0    0    0    0    0    0    1
[5,]    1    0    0    0    0    0    0
[6,]    0    1    0    0    0    0    0
[7,]    0    0    1    0    0    0    0

If I wanted to do straight-up matrix powers rather than typing M %*% M %*% M %*% ... %*% M 131 times, I would need to load the expm package with require(expm), which provides the %^% operator for matrix powers.

Here are some more powers of M.7:

> M.7   %^%   4
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    0    0    0    0    1    0    0
[2,]    0    0    0    0    0    1    0
[3,]    0    0    0    0    0    0    1
[4,]    1    0    0    0    0    0    0
[5,]    0    1    0    0    0    0    0
[6,]    0    0    1    0    0    0    0
[7,]    0    0    0    1    0    0    0

> M.7 %^% 5
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    0    0    0    0    0    1    0
[2,]    0    0    0    0    0    0    1
[3,]    1    0    0    0    0    0    0
[4,]    0    1    0    0    0    0    0
[5,]    0    0    1    0    0    0    0
[6,]    0    0    0    1    0    0    0
[7,]    0    0    0    0    1    0    0

> M.7 %^% 6
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    0    0    0    0    0    0    1
[2,]    1    0    0    0    0    0    0
[3,]    0    1    0    0    0    0    0
[4,]    0    0    1    0    0    0    0
[5,]    0    0    0    1    0    0    0
[6,]    0    0    0    0    1    0    0
[7,]    0    0    0    0    0    1    0

> M.7 %^% 7
     [,1] [,2] [,3] [,4] [,5] [,6] [,7]
[1,]    1    0    0    0    0    0    0
[2,]    0    1    0    0    0    0    0
[3,]    0    0    1    0    0    0    0
[4,]    0    0    0    1    0    0    0
[5,]    0    0    0    0    1    0    0
[6,]    0    0    0    0    0    1    0
[7,]    0    0    0    0    0    0    1

Look at the last one! It’s the identity matrix! Back to square one!

Or should I say square zero. If you multiplied again you would go through the cycle again. Likewise if you multiplied intermediate matrices from midway through, you would still travel around within the cycle. The exponent rules hold with the power taken modulo 7: thing^x × thing^y = thing^[(x+y) mod 7].

A picture of the cyclic group Z3 with three elements. No, I'm not going to draw another one with seven elements. You can draw that one.

What you’ve just discovered is the cyclic group P₇ (also sometimes called Z₇). The pair (M.7, %*%) is one way of presenting the only consistent multiplication table for 7 things. Another way of presenting the group is the pair ({0,1,2,3,4,5,6}, + mod 7) — that’s where it gets the name Z₇, because ℤ = the integers. A third way of presenting the cyclic 7-group uses complex numbers, and we can also do it in R:

> w <- complex( modulus=1, argument=2*pi/7 )
> w
[1] 0.6234898+0.7818315i
> w^2
[1] -0.2225209+0.9749279i
> w^3
[1] -0.9009689+0.4338837i
> w^4
[1] -0.9009689-0.4338837i
> w^5
[1] -0.2225209-0.9749279i
> w^6
[1] 0.6234898-0.7818315i
> w^7
[1] 1-0i


Whoa! All of a sudden at the 7th step we’re back to “1” again. (A different one, but “the unit element” nonetheless.)

So three different number systems

  • counting numbers;
  • matrix-blocks; and
  • a ring of imaginary numbers

— are all demonstrating the same underlying logic.

Although each is merely an idea with only a spiritual existence, these are the kinds of “logical atoms” that build up the theories we use to describe the actual world scientifically. (Counting = money, or demography, or forestry; matrix = classical mechanics, or video game visuals; imaginary numbers = electrical engineering, or quantum mechanics.)


Three different number systems but they’re all essentially the same thing, which is this idea of a “cycle-of-7”. The cycle-of-7, when combined with other simple groups (also in matrix format), might model a biological system like a metabolic pathway.

Philosophically, P₇ is interesting because numbers—these existential things that seem to be around whether we think about them or not—have naturally formed into this “circular” shape. When a concept comes out of mathematics it feels more authoritative, a deep fact about the logical structure of the universe, perhaps closer to the root of all the mysteries.

In the real world I’d expect various other processes to hook into P₇—like a noise matrix, or some other groups. Other fundamental units should combine with it; I’d expect to see P₇ instantiated by itself rarely.

Mathematically, P₇ is interesting because three totally different number systems (imaginary, counting, square-matrix) are shown to have one “root cause” which is the group concept.

John Rhodes got famous for arguing that everything, but EVERYTHING, is built up from a logical structure made from finite simple groups, of which P₇=C₇=Z₇ is one (Rhodes’ favourites are the SNAGs, the simple non-abelian groups, but prime-order cyclic groups belong to the decomposition too). viz, algebraic engineering

Or, in the words of Olaf Sporns:

[S]imple elements organize into dynamic patterns … Very different systems can generate strikingly similar patterns—for example, the motions of particles in a fluid or gas and the coordinated movements of bacterial colonies, swarms of fish, flocks of birds, or crowds of commuters returning home from work. … While looking for ways to compute voltage and current flow in electrical networks, the physicist Gustav Kirchhoff represented these networks as graphs…. [His] contemporary, Arthur Cayley, applied graph theoretical concepts to … enumerating chemical isomers….

Graphs, then, can be converted into adjacency matrices: put a 0 in the [row=a, column=b] entry where there is no connection from a to b, and a (±)1 where there is a (directed) link between the two nodes. The sparse [0’s, 1’s] matrix M.7 above is a transition matrix of the cyclical C₇ picture: 1 → 2 → 3 → 4 → 5 …. A noun (C₇) converted into a verb (%*% M.7).
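You can check that claim in R: build the adjacency matrix of the directed 7-cycle straight from its edge list, and it comes out identical to the rotated identity matrix from earlier.

```r
# edge list of the directed cycle 1 → 2 → 3 → 4 → 5 → 6 → 7 → 1
edges <- cbind(from = 1:7, to = c(2:7, 1))

A <- matrix(0, 7, 7)
A[edges] <- 1          # a 1 in [row=a, column=b] wherever there is a link a → b

M.7 <- diag(7)[ c(2:7, 1), ]
identical(A, M.7)      # TRUE: the graph-as-matrix is the rotated identity
```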

In short, groups are one of those things that make people think: Hey, man, maybe EVERYTHING is a matrix. I’m going to go meditate on that.

June 25, 2012

Discrete differential geometry

Check out page 40 of the source PDF (calcula ex geometrica) — they talk about geometrical computation. Instead of

  1. approximating analog with digital
  2. doing digital arithmetic
  3. smoothing the result to fake an analog again

why not OOP-define computer operations that deal directly with the curves via their defining properties?

Sort of like how MATLAB, Mathematica, Wolfram Alpha, YACAS, SAGE, Maxima, Axiom, Maude, PARI, lie, Singular, GAP — the symbolic calculators — do integrals and derivatives by the same rules that you would on paper, rather than numerically approximating a function and then delivering only floating-point answers. The “computer as zillions of arithmetic operations” idea is totally incapable of giving a general answer like ∫cos=sin, or of recognising some Green’s-Theorem-type reduction so that a few quintillion computational steps don’t even need to be performed.
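Incidentally, even base R has a toe in this symbolic world: D() differentiates an expression by the on-paper rules and hands back an expression, not a float. A quick sketch:

```r
D(expression(cos(x)), "x")         # the general answer: -sin(x)
D(expression(x^2 * exp(x)), "x")   # the product rule, applied symbolically

# the result is still an expression, so it can be evaluated anywhere:
df <- D(expression(cos(x)), "x")
eval(df, list(x = pi/2))           # -1
```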

The exterior calculus — invention of Élie Cartan — appears to be the right set of principles to object-orientedly program to do the analog-of-symbolic-computation for curves and surfaces.

It makes sense to me. If you have a plane curve with essentially two parameters, why does it need to be represented as an arbitrarily long array of points=pairs? You should be able to get arbitrary precision (just like a symbolic calculator does) with just the two parameters, if the reasoning system you programmed reasons the right way.

Overload dem operators with curve+curve= and surface×curve=, the wedge operator’s leading the way.

June 23, 2012

In philosophical debates about absolute truth, people cite “the truths of pure mathematics” as beyond reproach—eternal and universal things discovered/invented by us fallible mortals. But the more deeply I look into these issues myself, the more I see evidence that mathematics is not as stable as I’d supposed:

  • constructivists and intuitionists argue that the foundations of mathematics don’t make sense
  • logicians accuse mathematicians of not being rigorous enough
  • mathematicians themselves admit they totally ignore foundational issues and just concentrate on getting interesting results that make sense within their set of assumptions and could probably be “straightened up” to satisfy the logicians
  • Bill Thurston referred to mathematics itself as a social entity — it is the dynamical creation of a community, it lives inside the heads of the people who prove these things and not on paper.
  • John L Bell and Geoffrey Hellman: “Contrary to the popular (mis)conception of mathematics as a cut-and-dried body of universally agreed upon truths and methods, as soon as one examines the foundations of mathematics, one encounters divergences of viewpoint and failures of communication that can easily remind one of religious, schismatic controversy.”

Norman Wildberger thinks real numbers have been a wrong turning in mathematics. He also claims, in the video above, that angles θ are illogical. (Or maybe I should say, certain angles are used illogically.) Some angles, like 60°, can be constructed via ruler and compass. But other angles like 34° and 26° are not constructible.

So although “I know what you mean” when you talk about a real number or an angle that measures 90.1°, maybe we should both recognise that they don’t really make sense and speak in air quotes.


Related but different. On the topic of left-brains, right-brains, closed-minds, and open-minds in science. You can see youtube user njwildberger being beaten up on the XKCD forums for suggesting such unconventional and—ick!—philosophical ideas. Listen to these self-satisfied, smarter-than-thou sabelotodos savaging the “ridiculousness” of someone who would undercut this Well Established Knowledge.

I find that incredible because XKCD’s vision of science seems to be about open-mindedness, learning from data, and accepting the truth based on logic rather than tradition or popularity.

(xkcd: “The Data So Far”)


OK, “data” needs to be replaced with something else in theoretical maths. But you could at least listen to what the guy’s saying rather than his credentials or his sweatpants. (Conversely: if John Conway says it, does that make it true? He gives talks in sweatpants as well.)

I bet ≥ some of these know-it-alls have lauded Galileo for smashing the accepted wisdoms handed down from Aristotle with cold, hard logic. What’s the difference between that and making fun of njwildberger because he’s suggesting something weird or unconventional? Prima facie it makes sense to me.

Maybe you don’t care about the foundational issues (isn’t that called hand-waving elsewhere?), or maybe you can disprove what he’s saying—but this PageRank 7 site is just attacking him rather than his idea. (For example they look at his publication record to see if he’s “someone we should take seriously”.)

You want to know why people aren’t interested in science? I think it’s in part because science and maths is associated with such stuck-up, judgmental people—putting down everyone who’s less “intelligent” than they are.

June 7, 2012

Harmonic and Circular Oscillation by quantumaniac via dataanxiety

I can’t find enough illustrations explaining phase space. Phase space is a space people make up with their minds. The fact that a real thing bobbing up and down harmonically along 1-D is equivalent to a circle (seems like 2-D? but a topologist would say S¹ and in fact we don’t use the interior of the circle at all) is such a huge mental leap forward, I can’t express how much it amazes me or how much potential I think this metaphor has for everything else.
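The bobbing-mass ↔ circle picture is only a couple of lines of R: plot position against momentum and the 1-D harmonic motion traces out S¹ (and, as promised, never uses the interior).

```r
t <- seq(0, 2*pi, length.out = 200)
x <- cos(t)      # position of the bobbing mass
p <- -sin(t)     # its momentum (the derivative of position)

plot(x, p, type = "l", asp = 1)   # phase space: the rim of a circle, nothing inside
```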

Think about the space of solutions of Rubik’s cube. Physically the cube is what it looks like, but paths toward the solution are like a high-dimensional pyramid with “solved” at the top and entropy at the bottom.

Rather than being just a plain-old (ordered) {red,blue,orange,yellow,green,white}⁹ with one particular configuration (starting point / solved) called the “centre”, all of that space gets equivalence-classed:

  • since some orientations can only be obtained by switching the stickers and not by legal moves,
  • and since some members of (ordered){red,blue,orange,yellow,green,white}⁹ are actually just setting the cube down on the table differently rather than twisting it. (and therefore equivalent-in-that-sense) 

Anyway the Rubik’s Cube is a “knot” in phase space but nothing like a knot in right-in-front-of-you vision.

Here’s another example: if you have a Mac, you can invert the system’s colour scheme (presuming colours can be 2-inverted rather than in-in-in-verted or triverted but that’s another story) by pressing Alt + Command + N.

You can also 2-invert the colours of just one window (not the system) by pressing Alt + Command + M. If you accidentally hit M, then N, instead of just N, your computer’s colour scheme would be 1-tangled. Or if you hit M on the wrong window, then hit N, then switched to another window and hit M, it would be even more tangled. Now, you could tell me that doesn’t make any sense: a Mac has a flat 2-D screen, with panes on it. Where do these “tangles” come from? It’s just some electronic signals zipping around and lighting up an LCD. Well, in phase space, in this particular mental representation which we can communicate about, it can be tangled.

Distance between Words

June 4, 2012

Which pair is more different?

  • keyboard | keyb`ard
  • keyboard | keybpard
  • keyboard | keebored

Of course in mathematics we get to decide among many definitions of size and there is no “correct” answer. Just what suits the application.

I can think of two approaches to defining distance measures between words:

  • sound-based — d(Hirzbruch, Hierzebrush) < d(Hirzbruch, Hirabruc)
  • keyboard-based — d(u,y) < d(u,o)

Reading on online fora (including YCombinator, tisk tisk) the only distance functions I hear about are the ones with Wikipedia pages: Hamming distance and Levenshtein distance.

These are defined in terms of how many word-processing operations are required to correct a mis-typed word.

  • How many letters do I need to insert?
  • How many letters do I need to delete?
  • How many letter-pairs do I need to swap?
  • How many vim keystrokes do I need?

and so on—those kinds of ideas.
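Base R actually ships the Wikipedia-famous one: adist() computes the Levenshtein distance (insertions + deletions + substitutions). Running it on the pairs above:

```r
adist("keyboard", "keyb`ard")   # 1: a single substitution
adist("keyboard", "keybpard")   # 1: also a single substitution
adist("keyboard", "keebored")   # 3: edit distance doesn't know they sound alike
```

Which is exactly the complaint: by Levenshtein’s lights keebored is three times farther from keyboard than keybpard is, even though it sounds identical — and keybpard and keyb`ard tie, even though p is a neighbouring key and ` is nowhere near.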

inter-letter interaction effects

If we could get conditional probabilities of various kinds of errors — like

  • Am I more likely to mis-type ous while writing
    • varoius
    • precarious
    • imperious
  • ? There could be some kind of finger- or hand-based reason, like if I’ve just been using right-handed fingers near my ous fingers, or that I have to angle my hand weirdly in order to hit the previous couple strokes in some other word?
  • Am i more likely to mis-type reflexive as reflexible when the document topic is gymnastics?
  • Am i more likely to make a typo in google if I’m typing fast?
  • What if you can catch me mis-placing my hand on the homerow/ how dp upi apwaus fomd tjos crazu stiff? That’s almost like just one error. (It’s certainly less distance from the real sentence than a random string of characters of equal length.)
  • Or if I click the mouse in the wrong place before correcting my spelling? d(Norschwanstein, Ndorschwanstein) or d(rehabilitation, rehabitatiilon)
  • Am i more likely to isnert a common cliche rather than what i actually mean after a word that begins a common cliche/

A Bit Of Forensics

EDIT: Once I got about halfway throguh this article, I stopped correcting my typoes, so you can see the kind that I make. I was typing on a flat keyboard, asymmetrically holding a smallish non-Mac laptop (bigger than an Eee) with my elbows out, head down — except when I type fast and interchange letters, with perfect posture, “playing the piano” with my ten finger muscles rather than moving my wrists — at an ergonomic keyboard with a broken M. I actually don’t recall which way i wrote this article. I may hav eeven written it in shifts.

Here are some nice ones as well. Look at the comments section. By the posting times (and text) you can see that the debate was feverish—no time for corrections and the correspondents were steamed up emotionally. Their typoes really have personalities—for example Kien makes a lot of errors with his right middle finger moving up. (did → dud, is → us, promoted → promotied, inquisition → iquisition, mean → meaqn, Church → Chruch, because → becuase, Copernican → Ceprican, your → you, clearly → cleary) but also some errors of spelling with no sound-distance (Pythagoras → Pythagorus) and uses both the sounds disingenious and disingenuous. Letter-switching, ilke I do, is common; a few fat-fingers (meaqn) or forgotten letters, but this iou stuff seems unusual and possibly characteristic of something.

Other participants make different sorts of errors, or at least with different frequencies (they’re relatively more likely to omit or switch letters than to use the wrong letter, for example). But let’s just focus on Ken because so many errors of the typoes are localised to that right middle finger. I wonder if Ken has a problem with that finger? Or maybe his keyboard is shaped in such a way that it’s difficult to correctly strike those keys specifically? (Maybe certain ergonomic keyboards would fit this — or an Eee Pc with the elbows out and “pigeon-toed” hands. But why would the errors then be localised to the right middle finger? It’s more mobile than pinky & ring fingers and we’re not taught to stick it to the homerow like the index finger.) I rule out the theory that his right hand hovers above the keyboard rather than sitting on the homerow because then he should make similar errors with yuiop and maybe bnm,.hjkl; as well. Also, notice that he doesn’t make comparable errors with ewr as with iou. How do we know he sits symmetrically? I have a tough time deciphering why there are more errors with that finger on a first read-through.

We could find more of Ken’s writing here and see how he types when he’s less agitated. I bet there are no Ceprican’s there but Pythagorus would still be. As for Chruch? Hmmm. Don’t know.

Big Data vs Models

Now the big-data-ists (the other half of Leo Breiman’s partition of statistical modellers -vs- data miners) would probably say “Google has a jillion search results including measurements of people correcting themselves and including time series of the letters people type — so just throw some naive Bayes at that pile and watch it come to the correct answer!” Maybe they’re right.

If someone wants to mess around with this stuff with me — leave me a comment. We could grab tweets and analyse typoes within differnet text-…[by which tool] was used to send the tweet. For example the Twitter website means it was keyboard-typed, certain mobile devices have Swype, other errors we might be able to guess tha tis …[that it’s] a T9 mobile keyboard.

  • Could we tell if a person is left-handed by their keyboard mistkaes?
  • Could we guess their education level/
  • Could we tell what tweeting platform they used by their errors rather than by 
  • Could we tell where they’re from? Or any other stalky information that advertisers/HR want to know but web browsers want to hide about themselves? (Say goodbye to mandatory drug testing in the workplace, say hello to your boss getting an email when a statistics company that monitors your twitter feed guesses you smoked pot last night based on the spelling and timing of your Facebook posts.)

May 31, 2012

Calculus is topology.

The reason is that the matrix of the exterior derivative is equivalent to the transpose of the matrix of the boundary operator. That fact has been known for some time, but its practical consequences have only been understood recently.

[S]uppose you know the boundary of each k-cell in a cell complex in terms of (k−1)-cells, i.e., the boundary operator. Then you also know the exterior derivative of all discrete differential forms (i.e., cochains). So, you know calculus. Smooth or discrete.

Peter Saveliev
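Saveliev’s statement fits in a few lines of R. A minimal sketch, with the smallest cell complex I could invent — a path v₁ —e₁→ v₂ —e₂→ v₃: write down the boundary operator as a matrix, and its transpose really does differentiate.

```r
# boundary operator ∂: rows are vertices, columns are edges
# e1 = [v1,v2] has boundary v2 - v1;  e2 = [v2,v3] has boundary v3 - v2
bd <- matrix(c(-1,  0,
                1, -1,
                0,  1), nrow = 3, byrow = TRUE)

f  <- c(1, 4, 9)     # a 0-form (cochain): a function sampled at the three vertices
df <- t(bd) %*% f    # exterior derivative = transpose of the boundary operator
df                   # differences along each edge: 4-1 = 3 and 9-4 = 5
```

So knowing how cells glue together (the boundary operator) is the same information as knowing how to take d of any discrete form on them.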

Climate Statistics

May 21, 2012

httpness: (studying statistics) Can there be a different standard deviation up and down?
isomorphisms: Yes. It’s called a semideviation (or a quasinorm). There are a lot of people who argue that semideviations and quasinorms are more natural than standard deviations and norms.
httpness: So that’s not a normal distribution?
isomorphisms: Whatever distribution you’re using, there are different measures of dispersion on that — standard deviation, downside risk / semideviation, interquartile range, kurtosis, etc.
httpness: I was just thinking about temperatures. The standard deviation changes depending on the time of year, and the chance of unseasonably warm or cold days changes too.
httpness: Here’s an example of what I mean. Let’s say during the summer there _is_ a standard deviation and it’s the same up and down. But at another time of year there could be more chance of a very warm day, and at a third time of year there could be more chance of an unseasonably cold day.
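Here is one way to compute the two semideviations httpness is asking about (the function names and the temperatures are my own invention):

```r
# like sd(), but counting only deviations on one side of the mean
semidev.down <- function(x) sqrt( sum( pmin(x - mean(x), 0)^2 ) / (length(x) - 1) )
semidev.up   <- function(x) sqrt( sum( pmax(x - mean(x), 0)^2 ) / (length(x) - 1) )

temps <- c(18, 21, 22, 23, 24, 25, 33)   # hypothetical daily highs: a warm-tailed month

sd(temps)            # symmetric dispersion
semidev.up(temps)    # dispersion above the mean
semidev.down(temps)  # dispersion below the mean — smaller here: the surprise is upside
```

The two semideviations split the variance: semidev.up² + semidev.down² = sd² (with the same n−1 denominator), which is why people who care about asymmetric risk report them separately.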