What Was The True Origin Of The Biblical Cubit?


What Was Its Original Length? - by Kenneth G. Parker

Updated January 2011

How was the Cubit's length originally determined?  Was there a scientific basis for the Cubit?  Some individuals claim that the origins of the Imperial system lie in ancient Israel and that the British peoples may owe their original system of measure to that nation.  Was the original English system of measure, long considered an unscientific system, related to the cubit, and was the English system really "unscientific" in its origin?  It is true that the legislated standards of English measure have varied from time to time, but does this prove that the origin of the system was “unscientific”?

The Hebrew standard, or basic measurement, was the Biblical Cubit, which is shown in Table 1 as 21 Imperial inches long.  Metrologists have reckoned the Biblical Cubit at anything between 16 and 25.2 Imperial inches.

Let’s take a look at a possible derivation of the length of the ancient cubit.  John Mitchell (City of Revelation) writes: "The Greek Cubit or Ell, made up of 25 digits, was a length of about 18 1/4 inches, which may be 1.52 or 1.53 feet."  Another writer, Ian B. Patten of Anchorage, Alaska, claimed 1.520 ft, or 18.24 inches, as the length of the Egyptian Cubit and held that it was calculated from the Earth's rotational velocity at the equator.

We can calculate the Earth's rotational velocity at the equator in feet per second.  In the following calculation 24,902.44 is the circumference of the earth at the equator in miles, 5,280 is the number of Imperial feet in a statute mile, 24 is the number of hours in the day and 60 x 60 is the number of seconds in an hour:

            Working this simple calculation shows that

                (24,902.44 x 5,280) / (24 x 60 x 60) = 1,521.8158

1,521.8158 is the Earth's rotational velocity at the equator in Imperial feet per second.

Dividing the above answer by 1,000, we arrive at 1,521.8158 / 1,000 = 1.5218 Imperial feet, the distance a point on the equator travels in one one-thousandth of a second.  Assuming that this was the origin of the Egyptian cubit, it would have a value of 1.5218 Imperial feet, or 18.26 Imperial inches.  Rounding the answer to 1.52 ft, the Egyptian cubit would have a value of 18.24 Imperial inches.
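
The same arithmetic can be run as a short calculation (a minimal sketch in Python; the figures are those used above, and the variable names are mine):

    # Earth's rotational velocity at the equator and the cubit length derived from it.
    equatorial_circumference_miles = 24_902.44   # equatorial circumference in miles, as given above
    feet_per_mile = 5_280                        # Imperial feet in a statute mile
    seconds_per_day = 24 * 60 * 60               # hours x minutes x seconds

    velocity_ft_per_s = equatorial_circumference_miles * feet_per_mile / seconds_per_day
    cubit_feet = velocity_ft_per_s / 1_000       # distance covered in 1/1000 of a second
    cubit_inches = cubit_feet * 12

    print(round(velocity_ft_per_s, 4))   # 1521.8158
    print(round(cubit_feet, 4))          # 1.5218
    print(round(cubit_inches, 2))        # 18.26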

The following Table 1 is a record of ancient Hebrew measures of length, taken from The Complete Works of Josephus, the ancient Jewish historian (Kregel Publications, 1964), page 727.

Table 1

TABLE OF THE JEWISH MEASURES OF LENGTH

Measure                          Cubits      Inches       Feet     Inches
Cubit, the standard                   1          21          1          9
Zereth, or large span               1/2        10.5          0          0
Small span                          1/3           7          0          0
Palm, or hand breadth               1/6         3.5          0          0
Inch, or thumb's breadth           1/18        1.16          0          0
Digit, or finger's breadth         1/24        .875          0          0
Orgyia, or fathom                     4          84          7          0
Ezekiel's Canna, or reed              6         126         10          6
Arabian Canna, or pole                8         168         14          0
Schoenus, line, or chain             80       1,680        140          0
Sabbath-day's journey             2,000      42,000      3,500          0
Jewish mile                       4,000      84,000      7,000          0
Stadium, or furlong, 1/16           400       8,400        700          0
Parasang                         12,000     252,000     21,000          0

(The second column gives each measure in cubits, the third its total length in Imperial inches, and the last two the same length in feet and inches where it amounts to a foot or more.)

See also Cumberland's Weights and Measures, p. 57, 58, 135, 136.

            If we assume that the Biblical and Egyptian cubits were equal in length, we can substitute 18.24 inches for the 21 inches shown as the length of the Cubit in Table 1 and, by multiplying 18.24 inches by the cubit figures listed in the second column of Table 1, arrive at the revised lengths of the various measures shown in Table 2.

Table 2

RECOMPUTED JEWISH MEASURES OF LENGTH

Measure                          Cubits        Feet      Inches
Cubit, the standard                   1        1.52       18.24
Zereth, or large span               1/2           0        9.12
Small span                          1/3           0        6.08
Palm, or hand breadth               1/6           0       3.040
Inch, or thumb's breadth           1/18           0      1.0133
Digit, or finger's breadth         1/24           0        0.76
Orgyia, or fathom                     4        6.08           0
Ezekiel's Canna, or reed              6        9.12           0
Arabian Canna, or pole                8       12.16           0
Schoenus, line, or chain             80       121.6           0
Sabbath-day's journey             2,000       3,040           0
Jewish mile                       4,000      6,080*           0
Stadium, or furlong, 1/16           400         608           0
Parasang                         12,000    18,240**           0

(Lengths shorter than a foot are given in inches; longer lengths in decimal Imperial feet.)

NOTE:

*The length of 6,080 feet arrived at for the Jewish mile in our recalculation is the common conversion of the U.S. Nautical Mile.  The U.S. Nautical Mile is actually 6,080.2 Imperial feet.  The common length used for the fathom is 6 Imperial feet, a difference of less than an inch from the fathom in Table 2.  The length of the Navy cable is 100 fathoms, or 600 Imperial feet, a difference of 8 feet from the Hebrew furlong in the recalculated Table 2.  It is interesting to note that the value of the Hebrew inch calculated in the above table is only .0133 inches different from the Imperial unit.

**The Parasang is shown to be 3 Nautical Miles.
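
The figures in Table 2, including the 6,080 ft Jewish mile and the 18,240 ft Parasang noted above, can be regenerated from the 1.52 ft cubit with a short sketch (Python; the list simply restates the cubit counts from Table 1, and the names and layout are mine):

    # Recompute the Jewish measures of length from a 1.52 ft (18.24 in) cubit.
    from fractions import Fraction

    CUBIT_FEET = 1.52  # assumed Egyptian/Biblical cubit in Imperial feet (18.24 inches)

    measures = [                              # (name, length in cubits) from Table 1
        ("Cubit, the standard",        Fraction(1)),
        ("Zereth, or large span",      Fraction(1, 2)),
        ("Small span",                 Fraction(1, 3)),
        ("Palm, or hand breadth",      Fraction(1, 6)),
        ("Inch, or thumb's breadth",   Fraction(1, 18)),
        ("Digit, or finger's breadth", Fraction(1, 24)),
        ("Orgyia, or fathom",          Fraction(4)),
        ("Ezekiel's Canna, or reed",   Fraction(6)),
        ("Arabian Canna, or pole",     Fraction(8)),
        ("Schoenus, line, or chain",   Fraction(80)),
        ("Sabbath-day's journey",      Fraction(2000)),
        ("Jewish mile",                Fraction(4000)),
        ("Stadium, or furlong",        Fraction(400)),
        ("Parasang",                   Fraction(12000)),
    ]

    for name, cubits in measures:
        feet = float(cubits) * CUBIT_FEET
        print(f"{name:28s} {feet:12.4f} ft  {feet * 12:12.4f} in")

    # e.g. the Jewish mile works out to 6,080 ft and the Parasang to 18,240 ft (3 x 6,080).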

Could the fact that the recalculated Hebrew mile coincides with the common conversion value of the Nautical Mile indicate that the two have a common origin?

Were the originators of the Hebrew system aware of, and actually using, the Nautical Mile (one minute of longitude at the equator) as the basis for their measures?  It is interesting to note that the value shown for the inch (1.0133) gives a foot of 12.16 Imperial inches.

            An article by an unknown author found on the internet, headed “Greek Foot”, makes the following claims:  “The Greek stadia is known to be 600 Greek feet in length.  An important question to solve, therefore, is how long were the Greek and Roman feet?  Accurate calculations will lead to the relationship, if any, between the Greek foot and the stadia length, and a possible Egyptian geodetic connection.  Greaves from Cambridge in 1639, and also Isaac Newton, investigated possible metrical associations.  Other scholars have investigated as well; see G. de Santillana and H. von Dechend, who wrote a very interesting work, "Hamlet's Mill".  They and others explored these ancient measurements, concluding that the length of the Roman foot was 11.664 inches and the Greek 12.15 inches.  This conclusion has been supported to a high degree of accuracy by scholars of Greek architecture.  Measurements of the Parthenon, both ancient and modern, show it to have been constructed according to a principle of proportionate length and width.  Penrose and others showed the length of the Greek foot to be exactly 12.15 inches, a significant geodetic measure, and that the Parthenon proved its exact length.  Given that the ancient Greeks used 600 Greek feet for their stadia, the true length of the Greek stadia is seen, therefore, to be exactly 607.5 ft (modern), or 1/10 of a Nautical mile.  In other words, the ancient Greeks knew a minute of arc at the latitude of Athens was equal to 10 stadia, and indeed our Nautical mile is defined as exactly 6,075 ft, as 1/21,600 of the earth's circumference, that is, one minute of arc.  This could of course be merely a coincidence.  Is it also a coincidence that the Egyptian measure of the Remen is found to be in proportion to the Greek stadia?  500 Egyptian Remen equal exactly one Greek stadia.”
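
The arithmetic in this quoted passage can be checked directly (a brief sketch; the 12.15 inch Greek foot is the figure quoted above):

    greek_foot_inches = 12.15                      # quoted length of the Greek foot, in Imperial inches
    stadion_feet = 600 * greek_foot_inches / 12    # a stadia of 600 Greek feet, in modern feet
    print(stadion_feet)         # 607.5
    print(10 * stadion_feet)    # 6075.0 -- ten stadia, the 6,075 ft "nautical mile" of the quotation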

          Another unknown author, in an online encyclopedia article, claims:  “The foot as a measure was used in almost all western cultures and was usually divided into 12, sometimes 10, inches/thumbs, or into 16 fingers/digits.  The first known standard foot measure was from Sumer, where a definition is given in a statue of Gudea of Lagash from around 2575 BC.  Some metrologists speculate that the imperial foot was adapted from an Egyptian measure by the Greeks, with a subsequent larger foot being adopted by the Romans.  Some of the earliest records of the use of the foot come from the region of ancient Greece.  The originators devised, or perhaps borrowed from Egypt, the degree of longitude, divided the circumference of the earth into 360 degrees, and subdivided the degree for shorter distances.  One degree of longitude comprised 600 stadia.  One stadion was divided into 600 feet.  Thus the degree of longitude measured 360,000 feet.  One mile was 10 stadia, or 6,000 feet.  This is essentially the same mile that was (or still is) used in the Western hemisphere, but the modern foot is longer than the original.  This could be explained by an ancient Egyptian measure of the degree of longitude made near Thebes, compared with a redefinition of the length of the foot referencing the degree of longitude at the equator.  The difference in the length of the geodesic foot measured at these two locations would account for the difference between the modern mile of 5,280 equatorial feet and the ancient mile of 6,000 Greek feet, or 10 stadia.”
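
The figures in this passage can likewise be checked with a few lines (a sketch; the 12.15 inch Greek foot used in the last line comes from the previous quotation, not from this one):

    stadia_per_degree = 600
    greek_feet_per_stadion = 600
    print(stadia_per_degree * greek_feet_per_stadion)   # 360000 -- Greek feet in one degree of longitude
    print(10 * greek_feet_per_stadion)                  # 6000   -- the ancient mile of 10 stadia, in Greek feet
    print(10 * greek_feet_per_stadion * 12.15 / 12)     # 6075.0 -- the same mile in modern feet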

The preponderance of evidence indicates that the Greek system of measure originally came through Noah, who used the cubit when constructing the Ark, and that, contrary to popular belief, the Biblical cubit and the Imperial system of measure actually share a common, accurate, scientific origin based on the Earth's rotational velocity at the equator.  Our modern concepts of ancient history need to be carefully and objectively restudied for accuracy!