Array notation is a kind of notation for large numbers. Compared to combinatorial functions, array notations have more complex definitions but are easier to evaluate. Several array notations are already known, such as Bowers’ exploding array function (BEAF), Bird’s array notation (BAN), the extensible-E system (ExE), and hyperfactorial array notation (HAN).

Here I present a **strong** array notation, and it goes far beyond the notations mentioned above. It’s made up of several parts, shown below.

The following 7 parts make the array notation as strong as the R function, and they’re called the **first generation**.

- Linear array notation (LAN)
- Extended array notation (exAN)
- Expanding array notation (EAN)
- Multiple expanding array notation (mEAN)
- Primary dropping array notation (pDAN)
- Secondary dropping array notation (sDAN)
- Dropping array notation (DAN)

On Dec 26, 2017, the rules (for the condition in the “adding rule”) from pDAN to DAN were updated, avoiding a plateau in recursion level.

The following 5 parts were my first attempt to go beyond the R function (i.e. the **second generation**), but they don’t work.

- Nested dropper array notation (NDAN)
- Dropper-expanding notation (DEN)
- Multiple dropper-expanding notation (mDEN)
- Second-order dropper-expanding notation (soDEN)
- Higher-order dropper-expanding notation (hoDEN)

The following 5 parts are my second attempt at the second generation, but they still don’t work.

- Nested dropper array notation (NDAN)
- Weak dropper-expanding notation (WDEN)
- Multiple weak dropper-expanding notation (mWDEN)
- Weak dropper-multiexpanding notation (WDmEN)
- Primary dropper-dropping notation (pDDN)

Hypcos, which part of your Strong Array Notation has the growth rate of the ordinal BHO?

Expanding array notation

Hypcos your Strong Array Notation has the same growth rate as Taranovsky’s ordinal notation. Is this correct?

I don’t know, because I haven’t fully compared them. The comparisons are quite complicated.

What’s the difference between ordinal notations, ordinal collapsing functions, and the fast-growing hierarchy? Are there any functions that have the growth rate of aleph-one (ℵ₁), the first uncountable ordinal?

The fast-growing hierarchy accepts a countable ordinal and outputs a function from natural numbers to natural numbers.

An ordinal collapsing function accepts several ordinals and outputs an ordinal. It’s a kind of ordinal notation.

An ordinal notation is a notation for expressing ordinals.

The FGH only accepts countable ordinals, so a function with growth rate aleph_1 (here growth rate is defined using the FGH) doesn’t exist.
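
The finite levels of the FGH can be written down directly. Here is a minimal sketch in Python (the function name `f` is my own choice; transfinite indices, which require fundamental sequences, are omitted):

```python
def f(n, x):
    """Fast-growing hierarchy f_n(x) for finite indices n.

    f_0(x) = x + 1, and f_{n+1}(x) applies f_n to x, x times.
    Transfinite indices would need fundamental sequences.
    """
    if n == 0:
        return x + 1
    for _ in range(x):  # iterate the previous level x times
        x = f(n - 1, x)
    return x

print(f(1, 5))  # f_1(x) = 2x, so 10
print(f(2, 3))  # f_2(x) = x * 2^x, so 24
```

Already at `f(3, x)` the values are astronomically large, which is why the hierarchy is used as a growth-rate yardstick rather than computed outright.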

Is multiplication a function or an algorithm?

What is the main difference between computable functions and uncomputable functions? Is it that computable functions use finite time and space and uncomputable functions use infinite time and space? Is this the reason?

A Turing-computable function can be simulated in finite time by a Turing machine. An uncomputable function is a function which is not computable.

So for instance, the function f(3) = 3^^^^10 is computable, because a Turing machine could take 3 as input on its tape and produce the value (output) 3^^^^10 in finite time? An uncomputable function goes beyond finite time?

Yes, the first number is computable. As for your second question, even though there exist constructs such as infinite time Turing machines (ITTMs), there still exist functions even those cannot compute, such as, e.g., the ITTM Busy Beaver function.

I think I’m starting to understand. Turing machines can compute functions because of their infinite memory tapes and runtime, but Turing machines cannot compute themselves. An nth-level Turing machine cannot go beyond itself or simulate itself.

Well, there do exist universal Turing machines which can simulate all other Turing machines, but none can, say, solve the Halting Problem. This is the case with most TM derivatives; for many, you can build a version of the Busy Beaver function which neither it nor less powerful machines can compute.

Are the states of Turing machines in the same category as hyperoperators like +, -, ÷, *, and up-arrows?

Although every hyperoperation can be simulated by some Turing machine, not every function computable by Turing machines can be represented only with hyperoperations.
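To illustrate the computable direction, hyperoperations have a short recursive definition. A sketch of Knuth’s up-arrows in Python (the function name is my own), where `arrow(a, n, b)` stands for a ↑ⁿ b:

```python
def arrow(a, n, b):
    """Knuth's up-arrow a ^^...^ b with n arrows.

    One arrow is exponentiation; each additional arrow
    iterates the operation one level below.
    """
    if n == 1:
        return a ** b
    if b == 0:
        return 1  # the empty iteration at every level
    return arrow(a, n - 1, arrow(a, n, b - 1))

print(arrow(3, 1, 3))  # 3^3 = 27
print(arrow(3, 2, 3))  # 3^^3 = 3^3^3 = 7625597484987
print(arrow(2, 3, 3))  # 2^^^3 = 2^^4 = 65536
```

A Turing machine can carry out this recursion, but the converse fails: functions such as the Ackermann-style diagonalization over all hyperoperations outgrow any fixed hyperoperation expression.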

So a Turing machine could compute the function f(10) = agoraphobia = E100 &(&(&(…&(&(&({#,#,1,2})))…)))100 w/100 &’s (Sbiis Saibian’s number), because it has an infinitely long tape in both directions (memory storage space) and infinite runtime. It just needs enough states to do so. The states are the Turing machine’s source of computational power. Is this correct?

A Turing machine must halt in a finite amount of time, but the main power of the Turing machine lies in its arbitrarily long tape.

How does the Church-Kleene ordinal relate to Turing machines?

If a Turing machine were to compute the function f(3) = Graham’s number, does that mean that the Turing machine would take 3 consecutive 1s as input, operate for a finite number of steps, then halt with Graham’s number of consecutive 1s left on its tape?

It depends on your I/O encoding scheme; Turing machines by themselves don’t have any notions of this.
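For instance, one common (but arbitrary) convention is unary: the number n is written as n consecutive 1s. Below is a minimal Turing machine simulator sketch in Python, with a machine computing the successor function under that convention; the simulator API, transition format, and names are my own invention, not part of any standard definition:

```python
def run_tm(transitions, tape, state="q0", halt="halt", max_steps=10_000):
    """Simulate a one-tape Turing machine.

    transitions maps (state, symbol) -> (new_state, write, move),
    with move in {-1, 0, +1}; the blank symbol is '_'.
    """
    cells = dict(enumerate(tape))
    pos, steps = 0, 0
    while state != halt:
        sym = cells.get(pos, "_")
        state, write, move = transitions[(state, sym)]
        cells[pos] = write
        pos += move
        steps += 1
        if steps > max_steps:
            raise RuntimeError("step limit exceeded")
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Successor under the unary convention: scan right, append one more 1.
succ = {
    ("q0", "1"): ("q0", "1", +1),
    ("q0", "_"): ("halt", "1", 0),
}
print(run_tm(succ, "111"))  # three 1s in, four 1s out: '1111'
```

Under a different encoding (say, binary) the same abstract function would need a different machine, which is the point: I/O conventions live outside the Turing machine model itself.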

How do Computable functions relate to the Halting problem?

I would like to ask a few questions about functions and algorithms. So addition is the function and add or adding is the algorithm, multiplication is the function and multiply or multiplying is the algorithm, exponentiation is the function and exponentiate or exponentiating is the algorithm, and tetration is the function and tetrate or tetrating is the algorithm. Is this correct?

What is the limit of SAN with your plans?

Does anyone know how many symbols it would take to define (pDDN) in the language of First Order Set Theory (FOST)?

Maybe 200.

Hypcos, how many levels of pDDN would it take to reach Rayo’s number?

Uncomputably many levels. All these notations are actually computable.

Even BIG FOOT? Okay, how about the limit of pDAN?

SAN can be extended: +’ is dropper-droppers for ::::…+ and ++ is dropper-droppers for +::::…; then we can define nested droppers for +++…, like (N^+).

Hypcos, what are the circles (o) used for in your Strong Array Notation?

The circle is a dropper in NDAN. Read the NDAN page for more details.

Hypcos, which part of your Strong Array Notation has recursion level ψ(Ω_ω)?

The limit growth rate of mEAN is ψ(Ω_ω).

So that is comparable to legion array space?

BEAF is not well-defined beyond tetrational arrays, and neither is legion array space.

Okay.

Hypcos, which part of your Strong Array Notation has the Large Veblen Ordinal (LVO)?

Separator {1{1`1`2`}2} has recursion level LVO, and first appears in expanding array notation (EAN).

Hypcos, how would you describe TREE(3) with your Strong Array Notation?

The value of TREE(3) is too hard to work out. It has a lower bound of s(3, s(3, 262138 {1 {1`1,2 `} 2} 2) {1 {1`1,2 `} 2} 2 {1 {1`1,2 `} 5} 2). However, its upper bounds are much weaker, because we don’t know the growth rate of TREE(n). We just know TREE(3)<SSCG(3), where SSCG(n) grows slower than s(n,n{1,,2,2}2).

So s(3, s(3, 262138 {1 {1`1,2 `} 2} 2) {1 {1`1,2 `} 2} 2 {1 {1`1,2 `} 5} 2) < TREE(3) << s(3,3 {1,,2,2} 2).

Cool.

Hypcos, how would you describe a Golapulus (10^100 & 10 & 10) with your Strong Array Notation?

Where would a Golapulus be in your Strong Array Notation?

Impossible.

Hypcos, how would you describe a 10^100 array of 10s array of 10s with your Strong Array Notation?

That would be approximately s(10,s(10,s(10,100){2}2){2}2) in exAN.

You are wrong. & isn’t defined.

Hypcos, how would you express Meameamealokkapoowa Oompa with your Strong Array Notation?

Meameamealokkapoowa Oompa is not well-defined. What’s more, Bowers’ googolisms beyond triakulus are not well-defined.

Okay.

So we can express BEAF up to the {1,,2} separator?

Indeed. {1,,2} is the same level as tetrational BEAF arrays.

However BEAF and SAN are not equivalent at that point.

Why have you run out of ideas for extensions?

Aarex I am a huge fan of your work with large numbers. I have been studying your work with large numbers for years. Could you show me the steps to go from Otoperoogol to Otopersuperoogol? These numbers are listed on your Extensible Up-arrow System part 4 Dimensional/Nested Array Up-arrow webpage.

My old googology site is closed, silly you.

The question was asked before you closed it, silly. And I’m sure he’ll find the time to bring back the extended arrow notation some time in the future…

And he’s made primary dropper-dropping notation, part 12. Whether he plans to make a part 13, I don’t know.

I have an idea for pDDN, going beyond +.

What’s the definition of the iterator in exAN?

The iterator is defined as the 2nd entry (not inside separators) of the array.

Can’t you, like, fix it?

It seems like he already has. (or at least tried to) The new NDAN is a replacement for the old one, and WDEN and mWDEN are analogues to DEN and mDEN. WDmEN seems to go a different direction from soDEN and hoDEN; perhaps he sees this as a better/stronger way to extend things.

Can you make the new version of DEN? (sDEN)

Can you explain in more detail why the latter five parts don’t work?

In NDAN, there is a problem that causes an infinite loop.

It still occurs in his second attempt. I have second thoughts.