The 'branch' proposal comments:
- Not at all comfortable with this system; it seems to defeat the consensus basis for Wikipedia. Consensus isn't reached by democratic process. We are not really trying to make an 'encyclopedia of what the majority wants'; we are trying to make something much more potent, an 'encyclopedia of undisputed facts'. By voting, or by the suggested branch system, we diverge from consensus at each branch or each vote. I appreciate the enormous amount of work involved in creating the idea, but I don't feel it is best for us all. My suggestion for edit wars is: if someone reverts your page, point it out to two other editors and drop out of the argument. If you can't stand to drop out of the argument, it's a sure sign of bias. I'll discuss this at length another time if you like. And thanks for all the effort you've put into this; you obviously have the unselfish diligence that I see as essential to this project. Drop me a line if you like. Pedant 11:08, 2004 Aug 18 (UTC)
Hello. Are you still active here? I had a disagreement over the metacompiler definition here, mostly substantiated by historical metacompilers.
Anyway, I see you were active in the PEG talk discussions. Maybe you can help on the metacompiler article.
I have researched metacompilers and found websites using the FORTH description, which is basically "a self-hosting compiler" and doesn't distinguish them from any other self-hosting compiler. By the FORTH metacompiler definition, a C++ compiler written in C++ and compiling itself would be a metacompiler, and the same could be said of any self-hosting compiler. I think that FORTH is a specialized metacompiler because it defines itself in a subset of itself, and programming in FORTH is extending the language, creating dialects of the FORTH language. I am not a FORTH proponent. I basically came to that understanding reading the FORTH metacompiler references my opponent put forth in arguing his claim that a metacompiler is a compiler-compiler that compiles itself.
Most other references are to the Schorre metacompilers.
A few are not explained or named, but are referenced as used in the development of ...
The Metacompiler topic referenced Schorre metacompilers, some of which were not used to compile themselves, even though that was claimed as the distinguishing feature of a metacompiler.
The Schorre metacompilers are related to PEGs. The main differences are implementation, backtracking, and output. Backtracking in CWIC, for example, is controlled using different alternant (or) operators and is independent of the rule. There are actually two levels of backtracking. CWIC has token rules that backtrack automatically. In syntax rules, backtracking occurs when the input has advanced and a failure occurs; a non-backtracking alternant will not be tried.
X = A B / C;
will not attempt C if A succeeds and B fails. Written as:
X = A B \ C;
C will be attempted if B fails. Further, even if A only partially succeeds and there is no nested backtracking alternant, C would be attempted, with the state restored to that before A was called. Backtracking returns the parse state to the beginning of the left backtrack alternant and goes on to the right alternant. These metacompilers do not build a parse tree automatically. They stack parsed atomic objects (symbols, numbers, ...) and instead have transform directives as part of the language. CWIC builds an abstract syntax tree, or it could be described as a functional representation of the source: a + b × c would be translated to ADD[a,MPY[b,c]]. It can be looked at as nested functional notation or as a tree structure. It is actually implemented as a list whose first element is a node. LISP 2 is a part of CWIC.
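The two alternant behaviors and the tree output described above can be sketched in Python. This is only an illustration of the semantics, under my own naming; CWIC compiled its rules to executable machine code rather than interpreting combinators like these:

```python
# Sketch of CWIC-style alternants over a token list.
# A parser takes (tokens, pos) and returns (ok, new_pos).

def tok(t):
    """Match a single literal token."""
    def fn(toks, pos):
        if pos < len(toks) and toks[pos] == t:
            return True, pos + 1
        return False, pos
    return fn

def seq(*parsers):
    """Match parsers in order; on failure, report how far input advanced."""
    def fn(toks, pos):
        p = pos
        for parser in parsers:
            ok, p = parser(toks, p)
            if not ok:
                return False, p      # p > pos means input was consumed
        return True, p
    return fn

def alt_slash(a, b):
    """'/' : non-backtracking alternant. If a consumed input and then
    failed, b is NOT tried -- the whole rule fails."""
    def fn(toks, pos):
        ok, p = a(toks, pos)
        if ok:
            return True, p
        if p != pos:                 # a advanced, then failed: long failure
            return False, p
        return b(toks, pos)
    return fn

def alt_backslash(a, b):
    """'\\' : backtracking alternant. On any failure of a, the parse
    state is restored to before a was called and b is tried."""
    def fn(toks, pos):
        ok, p = a(toks, pos)
        if ok:
            return True, p
        return b(toks, pos)          # retry from the original position
    return fn

A, B, C = tok('A'), tok('B'), tok('C')
x_slash = alt_slash(seq(A, B), seq(A, C))      # X = A B / A C;
x_back  = alt_backslash(seq(A, B), seq(A, C))  # X = A B \ A C;

# With input A C: A succeeds, B fails after input advanced.
assert x_slash(['A', 'C'], 0) == (False, 1)    # '/' does not try the right side
assert x_back(['A', 'C'], 0) == (True, 2)      # '\' restores state and succeeds

# Sketch of the tree output: operands are combined under a node,
# giving a list whose first element is the node, as in ADD[a, MPY[b, c]].
def to_tree(toks):
    def term(i):
        left, i = toks[i], i + 1
        while i < len(toks) and toks[i] == '*':
            left, i = ['MPY', left, toks[i + 1]], i + 2
        return left, i
    left, i = term(0)
    while i < len(toks) and toks[i] == '+':
        right, i = term(i + 1)
        left = ['ADD', left, right]
    return left

assert to_tree(['a', '+', 'b', '*', 'c']) == ['ADD', 'a', ['MPY', 'b', 'c']]
```

The `p != pos` check in `alt_slash` is what makes the `/` operator "non-backtracking": a failure after the input has advanced propagates upward instead of being retried locally.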
They are not parser generators but are compiled into executable code.
I think these metacompilers are distinguished more by their implementation: being compiled into executable code, and including the transformation to object code, programmed in the metacompiler's metalanguage.
But I also want to be consistent with current terminology. For example, they use a metasyntax-defined metalanguage. That is, they are compilers that translate a metalanguage into executable code, and the metalanguage they translate is defined by a metasyntax. The metasyntax can be coded in the same metalanguage it defines. The metalanguage defines the language syntax and the transformation to object code.
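As a rough illustration of that self-description (an invented fragment in roughly the rule notation used above, with META II-style $ repetition; not an actual grammar from any of these systems), the rule syntax itself can be written as rules:

RULE = ID '=' EXP ';';
EXP  = SEQ $('/' SEQ);
SEQ  = TERM $TERM;
TERM = ID / STRING / '(' EXP ')';

A metacompiler given rules like these (plus token rules for ID and STRING) could parse its own rule language, which is the sense in which the metasyntax is coded in the metalanguage it defines.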
Look at the metacompiler edit on my talk page. PEGs are not as powerful as the Schorre line of metacompilers developed in the 1960s. CWIC is far more advanced: it parses to an abstract syntax tree and then, using unparse rules, can traverse and analyze the abstract tree, producing optimized code. It includes a full implementation of LISP 2 as the action of an unparse recognition. META II is a toy compared to CWIC. Steamerandy (talk) 23:37, 27 December 2014 (UTC)
Hi. Thank you for your recent edits. An automated process has detected that when you recently edited Consensus (computer science), you added a link pointing to the disambiguation page Paxos. Such links are usually incorrect, since a disambiguation page is merely a list of unrelated topics with similar titles. (Read the FAQ • Join us at the DPL WikiProject.)