Print Page - Improvements in Programming Languages

Theo's Forum

IT-Consultant: Charles Pegge => Software Design and Development Issues => Topic started by: Charles Pegge on May 12, 2007, 10:38:30 PM

Title: Improvements in Programming Languages
Post by: Charles Pegge on May 12, 2007, 10:38:30 PM

There are over 8000 recorded programming languages, past and present, so inventing them is not that difficult.
With many of these languages, just about any computation is possible, but with varying degrees of difficulty and complexity.

What makes a language easy to use? Will it compile and execute efficiently? Is it easy to read, to trace errors in, and to modify? Can the language be used to develop itself or generate new code?

What are the strengths and weaknesses of modern programming languages like C++, BASIC, Java, Python or XML?

Is it possible to take the best features of all of these languages and meld them into one unified system that is not deficient in any quarter? Would the result be a monster?

These questions, I believe, become answerable once we get down to specifics, and the possibilities for simplification will then become apparent.

This thread is for some dreaming along this theme, and since this site is mostly about PowerBasic, that will be a good starting point.


Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 13, 2007, 12:34:24 PM

BASIC and C

Developed in 1963 as a computer science teaching aid, BASIC (Beginner's All-purpose Symbolic Instruction Code) was designed for short, easy-to-understand code that could be picked up rapidly by people with very limited experience.
Because it was a compact but complete language, it was widely adopted as a standard for home computers in the late seventies, usually in 8 to 16 KB of ROM. Turn on the computer, and up came the BASIC editor after about a second.

With the rise to dominance of the Microsoft-based PC, C/C++ became the language of professional software developers, and BASIC, the standard language supplied with the PC, was left to languish in its primitive form.

This gave other developers the opportunity to come in, but resulted in divergent versions of the language.
Turbo Basic, later to become PowerBasic, made its appearance in the late 80s as an exceptionally efficient and well-specified compiler.

In the 90s, Windows and its graphical user interface displaced most MS-DOS applications and drove the development of BASIC towards interfacing with the complex operating system calls needed to drive Windows and to interoperate with Windows applications. This required the adoption of a number of C-style constructs, e.g. pointers and passing parameters by value.

But while PowerBasic can do most of the things that C does, it has not gone all the way to acquiring fully fledged C++ capabilities. Judging by the misuse of C++, resulting in the massive inflation of code size in Windows-based systems and a tangle of complexity, this is quite understandable. Even the C++ stream library, which basically manages strings and files, adds 500 KB of code to the executable. But C++ has many virtues that do not lead inevitably to monstrous code.

Anyway here is a cursory list comparing the two languages.

The virtues of modern compiled BASIC

Keywords with obvious meaning.
Block-structured programming.
Rich kernel of functions.
Built-in string handling.
Built-in memory management / automatic garbage collection.
Macros.
Contains most of the functionality of C.

The virtues of C++

Scoping of variables in namespaces and blocks.
Initialisation of variables at declaration.
Encapsulation.
Overloading.
Object-oriented programming: inheritance, classes and objects.
Elemental enough to form the basis of higher-level languages.
Widely used across many platforms.


Some BASIC vices

Verbose syntax.
Ad hoc syntax, e.g. file operations.
The name itself: BASIC is sophisticated, not basic.


Some C++ vices

Confusing use of symbol combinations.
Multiple meanings of curly braces, making code difficult to read.
Features leading to bloated code: templates.
No automatic garbage collection.
Lack of intrinsic functions: just about everything requires a library / include.

Combining the languages

PowerBasic already has most of the C capability, using different words and symbols, so adopting C++ extensions would not mangle the syntax in any way. In fact it would produce a much clearer expression of the logic by dispensing with many of the confusing symbol combinations overused in C/C++.


Some weakness in both BASIC and C++

Missing interpretive or just-in-time compilation capability.
Cannot support the functional programming paradigm.

These weaknesses are inevitable in any purely compiled language. For the flexibility demanded by many situations, an interpretive layer is required, or a just-in-time compiler as part of the run-time system. One of the causes of bloating in C++ is trying to make a statically compiled language look flexible: all variations in parameter types have to be catered for prior to execution. The same is true for BASIC, but its generous built-in string handling makes interpretive operations much simpler to accomplish.

...to be continued...
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 14, 2007, 06:26:15 AM
SIMPLIFYING PROGRAMMING LANGUAGES

Removing unnecessary syntax and structures makes a language easier to learn and easier to check for errors.

Some radical ideas!

Operator precedence

By observing strict left-to-right evaluation of an expression, all ambiguities are removed, at the minimal cost of a few extra brackets.
Compilation is also simplified.
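As a quick illustration, here is a Python sketch contrasting strict left-to-right evaluation with conventional precedence. The evaluator and its flat token format are invented for this example, not part of any existing language:

```python
# Strict left-to-right evaluation: each operator is applied as soon as
# its right operand is read, so 2 + 3 * 4 yields (2 + 3) * 4 = 20,
# not 14 as under conventional precedence.
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_ltr(tokens):
    """Evaluate a flat token list [num, op, num, op, num, ...] left to right."""
    tokens = list(tokens)
    acc = tokens.pop(0)
    while tokens:
        op = tokens.pop(0)
        rhs = tokens.pop(0)
        acc = OPS[op](acc, rhs)
    return acc

print(eval_ltr([2, "+", 3, "*", 4]))   # 20 under left-to-right rules
print(eval("2 + 3 * 4"))               # 14 under Python's precedence
```

Note how the evaluator needs no precedence table at all, which is exactly the compilation simplification claimed above.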

Control Structures

IF, ELSEIF, ELSE, CASE, FOR, WHILE, REPEAT, REDO and EXIT can all be replaced with a unified construct which simplifies logical checking. It goes something like this:

{
 if a then exit
 if b then
  ...
 end if
 if c then repeat
 if d then goto label
}

..
label:

The curly braces delineate a block.
There is a single-line 'if..then' and a multi-line 'if .. then .. end if'.
'exit' forces an early exit to the end of the block.
'repeat' directs execution back to the beginning of the block.
'goto' can be used specifically to jump out of nested blocks or to jump over other blocks.

Thus Occam's razor has been applied and no other control structures are needed.

Not only does this make programming logic cleaner, it also simplifies compilation, which is
essential if Just-in-Time methods are used.
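For comparison, the unified block maps directly onto one loop shape in an existing language. In this Python sketch (the function name and the counting task are illustrative), 'exit' corresponds to break and 'repeat' to continue:

```python
# The unified block { ... exit/repeat ... } as a single loop shape:
# the block below counts upwards, skips odd numbers ('repeat'),
# and leaves at a limit ('exit') -- no FOR or WHILE keyword needed.
def unified_block_demo(limit):
    collected = []
    i = 0
    while True:            # '{' -- start of block
        i += 1
        if i > limit:      # if a then exit
            break
        if i % 2 == 1:     # if c then repeat
            continue
        collected.append(i)
    # '}' -- end of block
    return collected

print(unified_block_demo(10))  # [2, 4, 6, 8, 10]
```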

...
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 14, 2007, 11:18:03 AM
< :)> SUPPORTING MARKUP </ :)>

Reserving <> Brackets

There are not enough bracket types in the basic ASCII character set to do all the things we need to express. In particular, there is a fundamental conflict between the inequality symbols and the brackets used in markup languages, among other uses. Syntactically, the simplest way to resolve this is to do away with the inequality symbols and replace them with assembler-like mnemonics, thus:

LT <
LE <=
GT >
GE >=
NE <>

also

EQ ==
This does away with the confusion between 'equality' and 'assignment' by reserving '=' exclusively for the latter.


if (a GE 42)and(b EQ 80) then c=4
 
As you can see, the solution is not verbose, and in my view it reads better.

We now have a pair of symbols which are available exclusively for use as brackets, and no longer need to continually switch contexts. And one of the main uses for these brackets is Markup expressions.


Using markup to define objects and complex data structures.

It goes without saying that life on the internet without markup languages is almost inconceivable. Within procedural languages too, they could be used for declaring, building and manipulating objects and data structures.

In an interpretive language, objects can be represented in a string, for example:

screwA="_
 <type> screw</>
 <material> steel</>
 <coating> phosphate</>
 <diam>#3.5</>
 <length>#38</>
 <thread> double</>
 <head>
   <shape> bugle</>
  <top> posidrive</>
 </head>
"

Its elements are referred to like this:
 screwA.length
 screwA.head.shape

New objects are created simply by copying the string content:
 new screwB=screwA

which in turn may be modified in several ways:

  screwB.head.top="slotted"   changing a property

  screwB+="<cost>#0.02</>"  adding a property

  screwB.head+="<diam>#6.0</>"  inserting a property

  screwB.head=""  removing a group of properties

If necessary code can also be efficiently embedded in a markup field.

What is proposed here is a very simple form of markup where attributes are not used within tags: the tags contain only names. The end tag may or may not contain the name; that is just a matter of clarity.
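To make the idea concrete, here is a rough Python sketch of such a markup-string object. The parser, the internal '_text' convention and the helper names are all invented for this illustration; they are one possible reading of the proposal, not part of it:

```python
# Parse the simple name-only markup (no attributes, '</>' closes the
# innermost tag, '#' marks a number) into nested dicts, then provide
# dotted-path access like screwA.head.shape.
import re

TOKEN = re.compile(r"<(/?)(\w*)>|([^<]+)")

def parse(markup):
    root, stack = {}, []
    node = root
    for close, name, text in TOKEN.findall(markup):
        if close:                            # '</>' or '</name>': pop a level
            parent, tag = stack.pop()
            if list(node) == ["_text"]:      # leaf tag: collapse to a scalar
                parent[tag] = node["_text"]
            node = parent
        elif name:                           # opening '<name>': descend
            child = {}
            node[name] = child
            stack.append((node, name))
            node = child
        elif text.strip():                   # tag content
            v = text.strip()
            node["_text"] = float(v[1:]) if v.startswith("#") else v
    return root

def get(obj, path):
    """Dotted access, e.g. get(screw, 'head.shape')."""
    for part in path.split("."):
        obj = obj[part]
    return obj

screwA = """
 <type> screw</>
 <diam>#3.5</>
 <length>#38</>
 <head>
   <shape> bugle</>
   <top> posidrive</>
 </head>
"""

screw = parse(screwA)
print(get(screw, "head.shape"))    # bugle
print(get(screw, "length"))        # 38.0
screwB = parse(screwA)             # a new object is just another copy
screwB["head"]["top"] = "slotted"  # changing a property
```

Because the object lives in a plain string, copying, serialising and transmitting it are trivial, which seems to be the main attraction of the scheme.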





....
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 15, 2007, 01:51:55 PM
PASSING AND RETURNING PARAMETERS


Functions with default parameters

function CreateWindowX( x=100, y=100, width=512, height=256, sysmenu=0 )
...
end function

various ways to call the function:

  CreateWindowX( )  use all default values

  CreateWindowX( 200, 200 )    set x and y positions only

  CreateWindowX( , , 250, 250 )    set width and height only

  CreateWindowX( sysmenu=1 )    use all the default values except for sysmenu

  CreateWindowX( width=800, x=50 )  use all default values except for x and width
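Python supports this pattern natively, so it makes a handy reference point. In this sketch, create_window_x and its parameters simply mirror the hypothetical CreateWindowX above:

```python
# Positional defaults plus keyword arguments let a caller override any
# subset of the parameters, in any order.
def create_window_x(x=100, y=100, width=512, height=256, sysmenu=0):
    return dict(x=x, y=y, width=width, height=height, sysmenu=sysmenu)

print(create_window_x())                        # all defaults
print(create_window_x(200, 200))                # x and y only
print(create_window_x(width=250, height=250))   # width and height only
print(create_window_x(sysmenu=1))               # sysmenu only
print(create_window_x(width=800, x=50))         # order-free keyword overrides
```

One difference: Python has no `( , , 250, 250 )` form for skipping positional slots; keyword arguments cover that case instead.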

Passing blocks of data by reference

For sets of contiguous data.

new a[100]=-1   Create an array of 100 numbers of value -1
a[10]=1,2,3,4,5,6,7,8,9,10 now put some numbers into the array starting at a[10]
lookat(ref a[12])

...

function lookat(p)
 p[0]?
 p[1]?
 p[10]?
end function

results:
3
4
-1




Returning data by reference

If a block of data is passed by reference, it is possible to write back into any of the elements.
To ensure clarity of intention in the code, any variables that were not created within the function,
should be preceded by set before they are written to.

example:

new vector[100]

vector[1]=1,2,3
addv(ref vector[1],2,4,6)
$ Result: `vector[1]` `vector[2]` `vector[3]`   // prints: Result: 3 6 9
...

function addv(v,a,b,c)
 set v[0]+=a; v[1]+=b; v[2]+=c
end function
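A rough Python analogue of both ideas (the names and the base-offset convention are illustrative): lists are passed by reference, so a function can read relative to the slice it was handed and write results back into the caller's storage.

```python
# 'ref a[12]' is modelled here by passing the list plus a base offset.
def lookat(buf, base):
    """Read elements relative to a base offset inside the caller's array."""
    return buf[base + 0], buf[base + 1], buf[base + 10]

def addv(buf, base, a, b, c):
    """Write back through the reference: mutate the caller's array in place."""
    buf[base + 0] += a
    buf[base + 1] += b
    buf[base + 2] += c

arr = [-1] * 100                 # new a[100] = -1
arr[10:20] = range(1, 11)        # a[10] = 1,2,...,10
print(lookat(arr, 12))           # (3, 4, -1)

vec = [0] * 100
vec[1:4] = [1, 2, 3]             # vector[1] = 1,2,3
addv(vec, 1, 2, 4, 6)
print(vec[1:4])                  # [3, 6, 9]
```

Python has no equivalent of the proposed `set` marker, so the in-place mutation in addv is only signalled by convention (the docstring); the proposal's explicit marker arguably makes the intent clearer.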






..to be expanded..

Title: Re: Improvements in Programming Languages
Post by: Eros Olmi on May 16, 2007, 09:08:52 PM
Charles,

very very interesting discussion about programming languages.
I will consider some of your suggestions in thinBasic: default values for parameters passed BYVAL, and mnemonics for logical operations.

I will continue to follow your thoughts on that.
Thanks a lot
Eros
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 17, 2007, 06:25:41 PM
Thanks Eros,

I am testing these ideas as I go, on an experimental  scripting language. The current version is in FreeBasic, running on Linux and Windows. The logic around default parameters can get quite complex.

If you would like to see a work in progress, it's about 50k source, 100k compiled, being tested and updated on a daily basis.

   http://www.pegge.net/xfers/run.bas
   http://www.pegge.net/xfers/run.exe
   http://www.pegge.net/xfers/main.pro

  sample extension library

   http://www.pegge.net/xfers/module.bas
   http://www.pegge.net/xfers/module.dll

  type run at the console

I hope to do a proper release soon!
Title: Re: Improvements in Programming Languages
Post by: Eros Olmi on May 17, 2007, 06:33:34 PM
Thanks a lot Charles.
I will test it for sure tonight when I'm back home.

I'm happy to say I've already implemented default values for all parameters passed BYVAL in thinBasic functions. It is a great addition and works perfectly.
You have great ideas.

Ciao
Eros
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 17, 2007, 10:32:30 PM
Simple Object Oriented Programming

This is a very simple scheme that does not assume a taxonomic tree of classes, but will support one if required.

   All objects are stored in markup strings.

   Methods (functions) are related to their objects by the <type> specified in the object.

   Functions may defer to 'parent' functions, or to any other functions, explicitly.

   Objects calling functions pass an invisible parameter referring to themselves, called this.



Taking the previous example of a data structure slightly extended:

new DryWallScrew="_
 <type> screw</>
 <material> steel</>
 <coating> phosphate</>
 <diam>#3.5</>
 <length>#38</>
 <thread> double</>
 <head>
   <shape> bugle</>
  <top> posidrive</>
 </head>
 <supplier>
  <name> Screwfix</>
  <price>#0.01</>
 </supplier>
"
new ScrewA=DryWallScrew

ScrewA.buy( quantity=1000 )

..

function screw.buy hardware.buy //linking function to another type or class

...

function hardware.buy(length=0,quantity=1)
 new t=length
 if not t then t=1
 t*=quantity*this.supplier.price
 $ Supplier `this.supplier.name`
 if length then $ Length `length`
 $ Quantity `quantity`
 $ Unit price `this.supplier.price`
 $ Total price `t`
 ...
 return t
end function
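One way to read this dispatch scheme in Python (the registry, the function names and the sample prices are all invented for this sketch): methods are registered against the type tag stored in the object, and every call passes the object itself as an implicit 'this'.

```python
# A method registry keyed by (type, function-name). 'screw.buy' defers
# to 'hardware.buy' simply by registering the same function twice.
METHODS = {}

def method(type_name, func_name):
    """Register a function as a method for all objects of the given type."""
    def register(fn):
        METHODS[(type_name, func_name)] = fn
        return fn
    return register

def call(obj, func_name, **kw):
    fn = METHODS[(obj["type"], func_name)]
    return fn(obj, **kw)              # obj becomes the implicit 'this'

@method("hardware", "buy")
def hardware_buy(this, quantity=1):
    return quantity * this["supplier"]["price"]

# linking screw.buy to hardware.buy, as in 'function screw.buy hardware.buy':
METHODS[("screw", "buy")] = METHODS[("hardware", "buy")]

screw_a = {"type": "screw", "supplier": {"name": "Screwfix", "price": 0.01}}
print(round(call(screw_a, "buy", quantity=1000), 2))   # 10.0
```

The point of the flat registry is that no class hierarchy is assumed, matching the post's claim; a taxonomy can still be layered on top by chaining registrations.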


..to be expanded..
Title: Re: Improvements in Programming Languages
Post by: Eros Olmi on May 18, 2007, 07:05:29 AM
I'm sorry Charles, but here I start not to follow. I have done some OOP and know some OOP theory, but never got too deep into it.
I need some time to study your RUN sources and get the inner data connections.

For the moment I will steal some of your ideas (sorry :D )
I have already added some in thinBasic: http://community.thinbasic.com/index.php?topic=890.0
Will continue to follow your posts here. Also interested in FreeBasic. We are making a thinBasic SDK in order to use FreeBasic as the development environment for thinBasic modules, so this post is even more interesting.

Thanks a lot
Eros
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 18, 2007, 02:34:32 PM

Eros, I'm putting in a few more notes and have cleaned up the example above. It is a very experimental form of OOP, the main innovation being the use of markup strings to define the object, but I think the only way I am going to grasp the subject is to come up with a system I would enjoy using. So I have ignored much of the C++ stuff.

My source code has very few comments, and some of the parsing logic is quite complex. It partially tokenises
and links the script at load time, but some of the linking happens later during run-time, and some of the script
is left untokenised where flexibility is required. So good luck trying to make sense of it!

Adding extra internal functions and extension library functions is pretty easy though.
Title: Re: Improvements in Programming Languages
Post by: Eros Olmi on May 18, 2007, 02:57:15 PM
Quote
but I think the only way I am going to grasp the subject is to come up with a system I would enjoy using. So I have ignored much of the C++ stuff.
I like that. I would have never created thinBasic without the pleasure to experiment and follow personal idea.

Quote
My source code has very few comments, and some of the parsing logic is quite complex. It partially tokenises
and links the script at load time, but some of the linking happens later during run-time, and some of the script
is left untokenised where flexibility is required. So good luck trying to make sense of it!
If the links you gave here remain valid, I will follow your improvements with interest. My target is not to understand the full details but to get the main design. I'm always interested in parsing, tokenizing and interpreting stuff, really. I'm finding your ideas brilliant; I will follow with interest even if I don't post.

Quote
Adding extra internal functions and extension library functions is pretty easy though.
Well, same here. Once there is a general method it is not so difficult. The problem is to find a way that is so "open" and general that it remains valid for future implementations, even if right now those future implementations are not even in my thoughts.

Ciao
Eros
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 19, 2007, 08:44:06 AM
Passing a Function as an Argument

do_many( 3, "greeting()" )
end

function greeting()
$ hello!
end function

function do_many( i, f )
{
 exec f
 if --i GT 0 then repeat
}
end function


result:
 hello!
 hello!
 hello!
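Passing functions as values is direct in modern languages; here is a Python sketch of the do_many example (names copied from the post, semantics approximated, with results collected so they can be inspected):

```python
# A higher-order function: do_many receives another function as an
# ordinary argument and calls it i times.
def do_many(i, f):
    results = []
    while i > 0:           # 'if --i GT 0 then repeat'
        results.append(f())
        i -= 1
    return results

def greeting():
    return "hello!"

print(do_many(3, greeting))   # ['hello!', 'hello!', 'hello!']
```

Note that the function is passed as a value, not as a string to be parsed, which sidesteps the compile-time trouble the thread mentions for static languages.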



A Program to Write a Program

new fi=output("hello.pro"); out(fi,"")
$ function greeting()
$ $ ------------
$ $ Hello World.
$ $ ------------
$ end function
close(fi)

$ execute hello
load "hello"; greeting();unload
$ done hello

Result:
 execute hello
 ------------
 Hello World.
 ------------
 done hello
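The same "program writes a program" loop can be sketched with the Python standard library: generate source text, write it to a file, then load and run it. The module name and the generated function are illustrative:

```python
# Generate a program, save it to a temporary .py file, load it as a
# module, and call the function it defines.
import importlib.util
import os
import tempfile

source = 'def greeting():\n    return "Hello World."\n'

fd, path = tempfile.mkstemp(suffix=".py")   # the generated 'hello' program
with os.fdopen(fd, "w") as f:
    f.write(source)

spec = importlib.util.spec_from_file_location("hello_gen", path)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)                # 'load "hello"'
print(mod.greeting())                       # Hello World.
os.remove(path)                             # 'unload'
```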


Passing a Program as an Argument

do_many(2,"load 'hello'; greeting(); unload")

Result
 -----------
 Hello World.
 -----------
 -----------
 Hello World.
 -----------



...
Title: Re: Improvements in Programming Languages
Post by: Theo Gottwald on May 19, 2007, 10:19:43 AM
I generally like new revolutionary ideas in programming.

There are, in the end, two directions:
a) easy-to-use languages which follow the way humans think and work
b) machine-oriented languages

The machine-oriented languages have much shorter programs.
As a beginner you may have trouble reading them.
As a pro, you save time compared to the human-oriented languages.

I remember the time when I was programming in FORTH.
I really liked the idea of the stack, the RPN, etc.
FORTH is clearly machine-oriented.
It is still the language which solves problems with the shortest code.
Besides APL.

Most people don't know APL. APL (for those who know it) is not only a programming language, it is a mathematical way of describing algorithms or problems. It will even improve the way a student thinks. That's why APL is really a good recommendation for science students.
Many commands are just 1 character long, because APL has its own keyboard layout with (mostly mathematical) functions. APL was my favourite besides FORTH,
because the code was really short.

Typing a mathematical solution in APL or a program in FORTH just saves time.
You don't need to type that much, while it may be harder for someone else to read it later.

Some of the APL Elements can be found in Euphoria.

Writing in FORTH, the programmer needs to have a paper near the keyboard:
he needs to write down what is actually on the stack. I doubt this could get popular these days,
while it has advantages.

Euphoria and APL have an interesting way to remove inner loops; see http://www.rapideuphoria.com/.

You just give an array to a command, and most commands work directly on arrays or parts of an array.
This means you just don't need a loop statement.
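A small Python sketch of the same whole-array spirit (the data is invented): the operation is applied to the array as a value, with the iteration expressed implicitly rather than as a FOR statement.

```python
# Whole-array operations in the Euphoria/APL spirit: comprehensions,
# map and sum take the array as a unit; no explicit loop statement
# structures the program.
a = [1, 2, 3, 4]

scaled = [x * 100 for x in a]            # multiply every element
doubled = list(map(lambda x: x * 2, a))  # same idea via map
total = sum(a)                           # reduce the whole array

print(scaled, doubled, total)            # [100, 200, 300, 400] [2, 4, 6, 8] 10
```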

From my point of view, just looking at languages like C or Basic is not enough,
because the differences disappear with time. What is left are just slightly different syntax definitions.

To get a better overview of what concepts are out there, a closer look at really different concepts is recommended.

Here are some suggestions:
APL (very short code!), LISP, PERL, FORTH, EUPHORIA, SCHEME, D (http://www.digitalmars.com/d/), EIFFEL, and of course there are some missing.

Where I can absolutely agree with what you write is that compilers should use the power of the new CPUs to get more intelligent.

For example:

If I write:

FQR a=0 TO 10

The IDE can figure out that this was probably meant to be a FOR statement and do what Google does with mistyped input under worse conditions. Ask directly:

Do you mean "FOR a= ...."?
Click OK to correct the mistake.

There is a lot more about this.
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 19, 2007, 04:00:04 PM
Thanks Theo. If all these languages can steal each other's best ideas, we will all have a much better programming experience.

I did some experiments once, trying reverse Polish notation (like Forth) in the script, but ran into logistical problems when it came to using optional or default parameters. The script internals resolve the parameters and function calls into a stack, or in my system a queue, before execution.

I know little about APL except that it used a number of specialised symbols, which made it easy to program for the mathematically minded but difficult for others to read afterwards. But I knew some programmers in the 80s who worked for I P Sharp and thought it was the best thing ever invented.

The functional languages you mentioned, Eiffel etc., are interesting in that they are mathematically rigorous and will not allow a variable to be assigned a value more than once, and this helps to deliver error-free code. They also allow functions to be passed as arguments to other functions, which is quite easy to adopt in a scripting language, but very troublesome for static compilation.

As I see it, computer languages are multilayered, with very fast but rigid processes at the lower levels, and slower but much more flexible processes above. It should be possible, however, to automatically compile static processes down to machine code to get the best of both speed and flexibility within a single language.
To do this, the run-time module must have a simple compiler at hand, ready to perform JIT compilation when needed.

The "no loops" way of operating on arrays of data is quite easy to implement: put the size of the array into its element 0. My script is halfway there. The loop is syntactically minimal.

new a[10] ,i=1
{
  ... ;  if ++i LE a then repeat
}

but this could also be put into a function called iterate to give you a one-liner.
Example, multiplying each element by 100:

iterate(ref a, "a[i]*=100")

...

function iterate(a,f, i=1)
{
 exec f;  if ++i LE a then repeat
}
end function

OOPing the syntax

  a.iterate(' this[i]*=100 ')

...

function .iterate(f, i=1)
{
 exec f;  if ++i LE this then repeat
}
end function
Title: Re: Improvements in Programming Languages
Post by: Marco Pontello on May 21, 2007, 03:01:39 PM
I find this article fitting:

The Curious Mind - Levine the Genius Language Designer (http://www.zafar.se/bkz/Articles/GeniusLanguageDesigner)

Quote
[...]
Would it be inappropriate to concoct a version of this story called "Levine the Genius Language Designer"? The first problem in discussing language design is that we do not know the answer to that question. We do not know whether the language designers are geniuses, or we ordinary programmers are cripples. Generally speaking, we only know how bad our present programming language is when we finally overcome the psychological barriers and learn a new one. Our standards, in other words, are shifting ones -- a fact that has to be taken into full consideration in programming language design.

Bye!
Title: Re: Improvements in Programming Languages
Post by: Donald Darden on May 21, 2007, 10:05:22 PM
Hey Charles, Eros!

I just caught up with your exchange.  Great stuff.  If you want to post any code
or links, please use my download section that was just added.  Same offer for
anyone else interested in topics related to programming, development, and such.

Programming represents an extension of the algorithmic concept to the dimension of the digital machine.  We approach it with words, symbols, logic and mathematics primarily, but people are looking for other means of expression, such as audio and visual.  I believe the goal for some is to make the machines more like us, or to make it easier for us to converse with machines.

Not that I think that this is entirely doable or even potentially useful.  Humans
are often at odds with each other over the meaning of things, and how things
should be interpreted.  I don't feel that I need to argue with my toaster in the
morning over how brown my toast should be, or which side it should be buttered on.

One thing presently lacking from most compiled languages, but which I have found in
some interpreted languages, is the ability to define and set variables, even to
enter formulas and mathematical expressions, as part of the user interface.

For instance, if the program is currently running and I enter a=1, the
program would recognize "a" as a new variable and set it equal to 1.  If I then
enter a=a+1, it would be equal to 2.  If I type ? a, it would print 2.  I
could define functions, which would then be part of the language by extension
and immediately useful.
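Python's exec/eval machinery models this interpreter feature quite directly. In this sketch (the interpret helper and its '?' query syntax are invented to mirror the example), user input at runtime defines variables and even new functions:

```python
# A tiny interactive-definition loop: assignments and definitions are
# executed against a shared namespace, and '? expr' queries a value.
env = {}

def interpret(line):
    line = line.strip()
    if line.startswith("?"):              # '? a' evaluates and returns a value
        return eval(line[1:].strip(), env)
    exec(line, env)                       # 'a = 1' defines or updates a name

interpret("a = 1")
interpret("a = a + 1")
print(interpret("? a"))                   # 2
interpret("def square(x): return x * x") # extend the language on the fly
print(interpret("? square(7)"))           # 49
```

In a real system one would sandbox the namespace, but the sketch shows how cheaply an interpreted layer grants this flexibility.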

One of the problems we have is the limitations of the ASCII character set, which
has been mentioned.  It is hardly ideal for recognizing mathematical expressions
as commonly encountered in books on the subject.  These have to be transcribed into a form that we can handle with the available ASCII symbols, and
yet we cannot support things like subscripts and superscripts, vertical relationships, and Greek symbols.

The fact is that mathematics is limited because it has only primitive ways of
expressing certain operations, and is unable to express others.  For instance,
you cannot show a way to take the integer or fractional part of a value,
to use the absolute value of something, to take the log of a negative number,
or to introduce logical decision points as part of a single equation.  So
equations and functions have to be continuous over a given range.

Programming offers many new ways to examine things, and to attempt to model
data, but the inability of programming languages to adapt to the existing notation used in mathematics makes it harder for mathematicians to make
the step towards using computers.  Several mathematical languages exist to
deal with this, but I've never worked with them, so I am unsure if they deal
with another issue, which is making it easy to introduce new symbols and
operators.  AFAIK, new symbols have to be defined as names: SIN(), COS(),
LOG() and ABS() are predefined functions, and PrintThis() might
be a user-named function.  I don't know if any support the creation of new
symbols per se.

When we write a program, we generally define many new processes and use
existing built-in functions, possibly some library of functions, and likely define
some functions of our own.  The problem then is that, in essence, we are all
involved in an individual, personalizing effort to extend the language into a
new area of expression or performance.  So, as a consequence, we all depart
from the underlying language whenever we create new functions, and that
may carry us into areas where others are left behind.

Libraries and shared code give us ways to try to join forces as we progress
in new areas, but this is at best a fragmented effort.  There is a lack of real
understanding of how the new functions work, what they do, what their
constraints are, how to employ them, where to find them, or whether they
even exist yet.

Once you get to a level where a language is extensible, and begin to extend it,
you are faced with the problem of managing that extensibility:
what should go into expanding it, how it should be done, how to identify what
has been added, how to identify what is available for it, and so on.

The syntax of a language is of great importance in determining how user-friendly it is, how well we adapt to it, and how well we can express ourselves when we use it.  Making improvements in this area is always
beneficial.  One operator I've always felt should be part of a language is
WHEN.  This would effectively be a callback function, I suppose.  It simply
says that WHEN something happens, THEN something else should happen as a
result.  WHEN does not mean that something will happen, but if it should happen,
then you have anticipated it and have prepared a suitable response.  In the
real-time world, programming is all about anticipation and consequences.
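One plausible reading of WHEN [condition] THEN [action] is a registry of condition/action pairs polled whenever the program's state changes. This Python sketch is only an illustration; the event names and the dispatch model are invented:

```python
# WHEN as anticipation: register condition/action pairs up front, then
# poll them against the current state; actions fire only if their
# condition currently holds.
watchers = []

def when(condition, action):
    watchers.append((condition, action))

def dispatch(state):
    """Run every registered action whose condition holds for this state."""
    fired = []
    for condition, action in watchers:
        if condition(state):
            fired.append(action(state))
    return fired

when(lambda s: s.get("key") == "q", lambda s: "quit requested")
when(lambda s: s.get("msg") == "ping", lambda s: "pong")

print(dispatch({"key": "q"}))     # ['quit requested']
print(dispatch({"msg": "ping"}))  # ['pong']
print(dispatch({"key": "x"}))     # []
```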
Title: Re: Improvements in Programming Languages
Post by: Marco Pontello on May 21, 2007, 11:42:58 PM
Several mathematical languages exist to
deal with this, but I've never worked with them, so I am unsure if they deal
with another issue, which is making it easy to introduce new symbols and
operators.
I too never used it, but Wolfram's Mathematica (http://www.wolfram.com/) seems to be the finest tool of this type.
I have seen some pretty amazing stuff done using it. Some examples & discussions about its features can be found on the Wolfram blog (http://blog.wolfram.com/).

Bye!
Title: Re: Improvements in Programming Languages
Post by: Donald Darden on May 22, 2007, 02:55:01 AM
I guess this topic would not be complete without some reference to some of the
available sites and software out there.

"MuPAD ('Multi Processing Algebra Data tool') originated in 1990, in the purest of research: work at the University of Paderborn for handling bulky data from investigations of group theoretical structure in non-linear systems. Paderborn's MuPAD Group developed it into an open-source, cross-platform algebra system, until, in 1998, funding pressure led the Group into a separate commercial launch by the newly-created SciFace Software. I think this offshoot can be judged as having gone well. SciFace has a long-standing distribution partnership with MacKichan Software in the USA, and a further vote of confidence from MacKichan's recent decision to drop its joint use of Maple, and use MuPAD as the sole computer algebra engine for its scientific typesetting range.

"MuPAD 3.0 has been launched for Microsoft Windows 95 to XP, and offers a large symbolic and numerical command set in a standard notebook format, with typeset formula output and a 'virtual camera' graphics viewer, Vcam. For programmers, MuPAD Pro contains a source-code debugger for troubleshooting user procedures, and advanced users can add 'dynamic modules', compiled run-time C/C++ applications. The picture is similar for other platforms, except for the older versions: 2.5.2 for MacOS X and 2.5.3 for Linux."

http://www.scientific-computing.com/scwjulaug04review_mupad_maple.html

http://mathforum.org/library/results.html?ed_topics=&levels=research&resource_types=software&topics=diffeq

http://archives.math.utk.edu/software/.msdos.directory.html
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 23, 2007, 12:22:01 PM
The WHEN statement

Donald talked about the need for a WHEN statement. That got me thinking.

As a solution to spatial and temporal sequences:

Consider the two electrical formulae:

e=i*r
p=i*e

To resolve all the values without knowing which are defined:

{
  when i,r then e=i*r
  when e,r then i=e/r
  when e,i then r=e/i
 
  when e,i then p=i*e
  when p,i then e=p/i
  when p,e then i=p/e

  when not e,i,r,p then repeat // or until no further resolution
}

A smart interpreter would be able to rearrange the lines to optimise execution time, according to what data is most frequently presented, and also decide when to give up trying!
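This when-block behaves like simple data-flow resolution, which can be sketched in a few lines of Python (the rule table mirrors the electrical example; the resolve function is invented for the illustration):

```python
# Rules fire when their inputs are known; the block repeats until
# nothing new can be resolved -- 'until no further resolution'.
def resolve(known):
    rules = [
        (("i", "r"), "e", lambda v: v["i"] * v["r"]),   # when i,r then e=i*r
        (("e", "r"), "i", lambda v: v["e"] / v["r"]),   # when e,r then i=e/r
        (("e", "i"), "r", lambda v: v["e"] / v["i"]),   # when e,i then r=e/i
        (("e", "i"), "p", lambda v: v["i"] * v["e"]),   # when e,i then p=i*e
        (("p", "i"), "e", lambda v: v["p"] / v["i"]),   # when p,i then e=p/i
        (("p", "e"), "i", lambda v: v["p"] / v["e"]),   # when p,e then i=p/e
    ]
    known = dict(known)
    progress = True
    while progress:                     # the block's 'repeat'
        progress = False
        for inputs, output, fn in rules:
            if output not in known and all(k in known for k in inputs):
                known[output] = fn(known)
                progress = True
    return known

print(resolve({"e": 12.0, "r": 4.0}))   # resolves i = 3.0, then p = 36.0
```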


Support for Parallel Processing


A temporal example

foundations.build()
{
  when foundations.done then slab.build()
  when slab.done then walls.build()
  when walls.done then floors.build()
  when floors.done then roof.build()
  when floors.done, walls.done then windows.install(); walls.plaster()
  when walls.plaster.done then walls.paint()
  when not walls.paint.done, roof.done then repeat
}
when walls.paint.done, roof.done then house.done=1


Special functions are required that initiate threads and return immediately, while allowing the thread to continue operating in the
background on the workspace provided by the object:

process .build()
  if this.busy or this.done then return
  this.busy=1
  ....
  this.done=1; this.busy=0
end process
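A hedged Python sketch of this 'process' construct (class, field and task names are all illustrative): build() starts a worker thread and returns immediately, the busy/done flags guard against double starts, and a polling when-loop coordinates the stages.

```python
# Fire-and-forget processes coordinated by polled done flags.
import threading
import time

class Process:
    def __init__(self, name):
        self.name = name
        self.busy = False
        self.done = False

    def build(self):
        if self.busy or self.done:    # 'if this.busy or this.done then return'
            return
        self.busy = True
        threading.Thread(target=self._work, daemon=True).start()

    def _work(self):
        time.sleep(0.01)              # stand-in for the real construction task
        self.done = True
        self.busy = False

foundations = Process("foundations")
slab = Process("slab")

foundations.build()
while True:                           # the polling when-block
    if foundations.done:
        slab.build()                  # when foundations.done then slab.build()
    if slab.done:
        break
    time.sleep(0.005)
print("slab complete")
```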


...

Title: Re: Improvements in Programming Languages
Post by: Donald Darden on May 24, 2007, 11:17:13 AM
You grasp some of the possibilities of WHEN very quickly Charles.

WHEN and IF perform similar operations, but I had the idea that, while running
through a stack of IF statements would not be the most efficient way to test for
every possibility, by linking WHEN to some external or predetermined factor it might be possible to thread through the IF statements that are related, and ignore the intervening IF statements that do not involve the same precondition.

Certainly the idea of using WHEN [condition] THEN [what to do] should be more comprehensible to someone new to programming than CALLBACK FUNCTION would be.  But the condition could be either an external event, such as a sequence of keys on the keyboard, or a message from another process, or it could be an event within the program that is defined in the manner of a conditional test.

Anyway, it was just a thought, and my ideas of how it might work are just tentative in nature.  My thoughts keep going back to this, and it is my feeling that it offers a chance to have real world extensions added to the supporting language.
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 24, 2007, 01:37:01 PM

Well, polling flags in a loop is not the most efficient way of doing things, but it is certainly flexible and can cope with multiple dependencies and trigger multiple actions. With a callback system, things can get quite complicated. I can't see an obvious way of using callbacks that could be applied to all situations.

In any case, most CPU time would be spent servicing the individual processes/threads. It's like a WinMain message loop, and multiple flags can be aggregated into single ones at various stages.

But a smart system, aware of process times and the probable order of execution, will be able to arrange processes adaptively so they execute smoothly. In a multicore system (and the future of computing must surely be parallel processing), CPU-intensive tasks can be allocated more resources, while simple but slow peripherals are left further down the list.

The difference between an IF and a WHEN, as I see it, is that the system can rearrange the WHEN statements within a block to gain the best advantage, though logically they are the same.


Title: Re: Improvements in Programming Languages
Post by: Donald Darden on May 25, 2007, 04:49:30 AM
Exactly right.  When you are dealing with IF clauses, you have to assume that the sequence of IF statements is somehow critical to the overall design of the program.  The order of IF statements must agree with the order specified within the source code.  WHEN statements could be considered independent and on a parity with other WHEN statements, so that rearranging the WHENs would not affect the general flow of the program processes.  The smart system could then attempt some way to optimize the testing or polling necessary to see if any WHEN condition is met.

Optimization could then be towards the most efficient method of testing to find any WHEN statements that might need to be satisfied.  It may also be possible to consider some WHEN conditionals to be of a lower priority or occurrence rate than others.  For instance, if you are looking for a specific keystroke, you can take into account the typical length of time between key presses on the keyboard.  A general scheme to note when the last key was pressed and released may precondition a WHEN statement not to be tested for a tenth of a second or more.  An efficiency scheme could then attempt to create multiple timing or polling loops that adapt over time to include or exclude certain tests based on an evaluation of their probability of occurring over a certain time lapse.  This optimization process could be transparent to the application programmer, but assures that the results will be optimized towards the greatest response rate possible within a given system's hardware, operating system, and running processes.

WHEN might even be used to wake up programs that are not currently executing in memory.  Most methods for waking up new programs seem to involve some scheduler program, and likely triggered by reaching a certain predetermined date and time.  WHEN might make it possible to preschedule program executions based on a time reference, or based on some other operational parameter.  WHEN might actually interface to existing schedulers, or have its own scheduler capability.  It might also be possible to universally examine all prescheduled WHEN programs and conditions through a separate interface related to the function of the scheduler.
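That adaptive-polling scheme can be sketched as a toy in Python: each WHEN condition carries its own polling interval, so rarely-true conditions (like keyboard input) are tested less often. The Condition class and the particular intervals are hypothetical illustrations, not an existing scheduler API.

```python
import time

class Condition:
    """A WHEN-style rule: a test, an action, and how often to poll it."""
    def __init__(self, test, action, interval):
        self.test, self.action, self.interval = test, action, interval
        self.next_check = 0.0           # earliest time this may be retested

def poll(conditions, deadline):
    while time.monotonic() < deadline:
        now = time.monotonic()
        for c in conditions:
            if now >= c.next_check:
                c.next_check = now + c.interval   # don't retest too soon
                if c.test():
                    c.action()
        time.sleep(0.001)               # yield between polling passes

fired = []
conds = [Condition(lambda: True, lambda: fired.append("tick"), 0.05)]
poll(conds, time.monotonic() + 0.12)    # fires roughly every 0.05 s
```

A smarter version could adjust each interval from observed hit rates, which is the "evaluation of their probability of occurring" step above.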

Title: Re: Improvements in Programming Languages
Post by: Theo Gottwald on May 25, 2007, 07:57:52 AM
Reminds me of an old Pascal book from the university.
The IF is called a "two-sided choice" (in translation) and the professor says:
"If you omit the ELSE and give only one alternative, your program may have a mistake."
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 25, 2007, 11:28:54 AM

When dealing with WHEN statements, I think we must gently but firmly say goodbye to ELSE. Once you start changing the order of execution, the logic has to be very simple and solid.

In any case, a lot of ELSEs in a program make the logic too complex to follow and unsafe to modify. I have used them a lot in BASIC and in the intricate business of parsing and interpreting code, ELSE clauses always cause trouble.

The CASE structure allows logic to be traced more easily, and altered without unforeseen consequences.

{
  IF .. THEN .. EXIT
  IF .. THEN .. EXIT
 ...other alternatives
}

Another way of doing this is with a subroutine instead of a block:

IF .. THEN .. END
IF .. THEN .. END
 ..
... alternatives
END
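In a language with early returns, this "test and EXIT" pattern looks like the following Python sketch (the classify function is just an illustrative example): each alternative is tested and left immediately, so no ELSE chains are needed and the cases can be read, and reordered, independently.

```python
def classify(ch):
    """Classify a single character using guard clauses instead of ELSE."""
    if ch.isdigit():
        return "digit"
    if ch.isalpha():
        return "letter"
    if ch.isspace():
        return "whitespace"
    return "other"      # the final alternative, reached only if none matched

print(classify("7"))    # digit
print(classify("!"))    # other
```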






Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on May 30, 2007, 01:28:26 PM
Assembler Using High-Level Language Syntax

The uncompromised specificity and efficiency of assembler combined with the easy syntax of a high level language. The best of both.

Example:

' conversion of string to upper case

'High Level Code
'--------------------------------------------
i=1; l=len(s); c=l
{
  if c LE 0 then exit
  a=asc(s,i)
  if (a GT 96) and (a LT 123) then a-=32; mid(s,i)=chr(a)
  i++; c--; repeat
}


' High Level Assembler
'--------------------------------------------
' esi indexes the string
' ecx indicates how many character bytes to convert

eax=0; push esi; push ecx
{
  if ecx LE 0 then exit
  eax = byte [esi]
  if byte eax GT 96 then if byte eax LT 123 then eax -= byte 32; [esi]= byte eax
  esi++; ecx--; repeat
}
pop ecx; pop esi


'---------------------------------------
'Low Level Assembler
'---------------------------------------
!  mov eax,0
!  push esi
!  push ecx
repeats:
!  cmp ecx,0
!  jle exits
!  mov al,[esi]
!  cmp  al,96
!  jle  iterinc
!  cmp  al,123
!  jge  iterinc
!  sub al,32
!  mov [esi],al
iterinc:
!  inc esi
!  dec ecx
!  jmp repeats
exits:
!  pop ecx
!  pop esi
'----------------------------------
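For comparison, here is the same algorithm transcribed into Python over a byte buffer (my transcription, not part of the original example): scan each byte and fold 'a'..'z' (codes 97..122) to upper case by subtracting 32, leaving everything else untouched.

```python
def to_upper(s: str) -> str:
    """Uppercase ASCII letters the way the assembler loop does: byte by byte."""
    buf = bytearray(s, "latin-1")
    for i, a in enumerate(buf):
        if 96 < a < 123:        # 'a'..'z', same bounds as cmp 96 / cmp 123
            buf[i] = a - 32     # fold to 'A'..'Z'
    return buf.decode("latin-1")

print(to_upper("Hello, world 123!"))   # HELLO, WORLD 123!
```

The bounds checks map one-for-one onto the cmp/jle/jge pairs in the low-level version, which is the point of the exercise: the high-level and assembler forms share the same shape.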
Title: Re: Improvements in Programming Languages
Post by: Theo Gottwald on June 01, 2007, 01:48:13 PM
Modern compilers have already reached the ability to do quite good optimization on simple code.

The question is similar to the chessmaster question:
When will compilers be able to produce faster code from an algorithm than a human ASM programmer?

You may say "Never", but take a look at chess-history.

The computer has the advantage that it is able to calculate latencies, processing times, averages and memory wait cycles much more accurately than a human programmer.

In the near future it may be as in chess:
a compiler is conceivable that produces faster code than a human programmer.

Looking at compiler technology in terms of optimizations, the Intel C compiler looks to me to be ahead of the competition, but even it is far from being a "Compiler Chessmaster".
Title: Re: Improvements in Programming Languages
Post by: Marco Pontello on June 01, 2007, 02:38:52 PM
Speaking of the limits of optimization, another thing that comes to mind is dynamic, runtime optimization. A whole new class of optimization opportunities opens up when considering the "live", running program. Certain VMs can do pretty remarkable things in this regard, in some situations.

Check this:
HP Dynamo - Transparent Dynamic Optimization (http://www.hpl.hp.com/techreports/1999/HPL-1999-77.html)
Quote
Dynamic optimization refers to the runtime optimization of a native program binary. This paper describes the design and implementation of Dynamo, a prototype dynamic optimizer that is capable of optimizing a native program binary at runtime... Contrary to intuition, we demonstrate that it is possible to use a piece of software to improve the performance of a native, statically optimized program binary, while it is executing. Dynamo not only speeds up real application programs, its performance improvement is often quite significant. For example, the performance of many +O2 optimized SPECint95 binaries running under Dynamo is comparable to the performance of their +O4 optimized version running without Dynamo.

Bye!
Title: Re: Improvements in Programming Languages
Post by: Theo Gottwald on June 01, 2007, 04:28:18 PM
And thinking in terms of "Playing Chess" even these "dynamic Runtime Optimizations" are just the beginnings of what is possible.

Actually these optimizations are mostly for one special architecture.
A next step could be to simulate the different available processor architectures and take the best code combinations.

But it is like playing: once you need to play stronger, you develop better strategies.
Simulating the opponent has long been normal in chess. But they do more.
They even give numeric weights to the different combinations in the game, and then search to a greater depth of exchanges.
The compilers have just started to learn "playing", which is indeed different from just "statically" replacing text combinations with ASM combinations, more or less out of predefined tables.
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on June 01, 2007, 10:50:17 PM
One optimisation that is quite hard for programmers to do, but much easier for the system, is working out parallel paths of execution. The x86 already has register-level parallel processing and branch prediction in conjunction with shadow register sets. SIMD extensions allow arrays of registers to be processed in a single instruction. The operating system can allocate threads to several processor cores. Google does its work over the internet with massively parallel computer networks.

When we start using the full 64-bit mode on the x86, the most useful feature will be the extra registers, so there is less need to push your pawns onto the stack for passing parameters to a function.

But the hardware guys seem to be well ahead of the software guys when it comes to generating computer power.
Title: Re: Improvements in Programming Languages
Post by: Donald Darden on June 02, 2007, 08:28:48 PM
Optimization is an elusive goal, because while you can refine code to work better in terms of implementing a specific algorithm, it may be that a different algorithm would serve better overall.  For instance, which would normally be more effective: converting all characters to upper case, converting all characters to lower case, or just dealing with each character in the case that you find it?  If you assume that you are working with standard text, then you might also assume that the vast majority of your characters are already in lower case, so less time is required to change all characters to lower case.  But if you were examining a body of code, you might find that all your key words are already in upper case, so in searching for key words, it might be best to treat each letter in the case that you find it.

Optimization by machine can only compare different approaches to doing the same thing, and determine which method is most efficient.  But a problem immediately arises, because the machine has to discern what you are attempting to accomplish, and this is really beyond its powers.  A programmer who prides himself on writing optimized processes may anticipate the things you are likely to try, then design his optimization methods to substitute a less evident but actually better approach for the obvious one you might adopt.  But that would be one programmer attempting to enhance the work of another, not something that the computer would undertake on its own.

While 64 bit programming may introduce additional addresses and more instructions, the downside is that the OS will still just be giving you a sandbox to play in, forcing you to learn how to play nicely with other running programs and processes.  You will find that you still have to save registers in order to free them up for your own use, and to restore them afterwards.  It's like learning to drive on a two lane road, then suddenly having to travel through the heart of a city on a roadway twelve lanes across.  You don't have the freedom of having all those lanes to yourself, you have to allow for other vehicles whizzing around you as well.

What you might hope for is an expanded set of instructions that show some insight by the designers as to what really needs to be done in software, and better tools for that purpose.  But then there is always the question of legacy support for the older architectures, and whether you want your code to work on existing 32-bit and 16-bit machines or not.  You may be forced to forego the use of really advanced features, or you may not even be able to access them because your compiler/assembler may not include that support, or you may not have any supporting documentation to describe them or how to access and use them.

It's generally understood that software development lags hardware design by at least five years.  The hardware guys make it and give it to the software guys, and then the software guys struggle to figure out what it is good for, and how to get the most out of the design. 

We've all seen or read claims that DirectX 10 will change game development,
and then maybe a year later, some titles that use DirectX 10 will begin to show up on store shelves.  At the same time, few video cards are able to support
DirectX 10 yet either.  So how does this fit in?  Well, even in the hot and heavy
game development market, it takes time to take advantage of new technology.
The pressure to bring new games to market is immense, with major bucks involved, so this is just a super-paced example that proves the point.

But hardware is not the only thing that evolves, and software does not limit its rate and growth of development to changes in the hardware.  New languages and tools are always appearing, new books appear to explain them to us, and new skillsets are expected of us, sometimes almost overnight.  I recall one job posting that wanted five or more years experience in a new language that had only become known commercially the year before.

The fact is, if you took all the possible languages, libraries, tools, and everything else now available to the programming community and stirred them all together in their many thousands, then cut a narrow slice to represent your ability to know and have experience with some of them, what are the chances that your narrow sliver will exactly coincide with another sliver that represents the job skills and experience requested by a job posting somewhere?

This is sometimes the advantage of the independent developer.  He can only bring to the job the things he (or she) had experience with, so the job, whatever it is, will be defined in those terms.  If you end up having to be replaced on the job, the likelihood is that the search will be for someone with your same qualifications.  Again, the improbability is that another person exists with exactly the same background that you have.

These are just observations that I've made.  I've also noted that we often do not choose the tool best suited to the job, but best known to the programmer or to the person identifying the requirements for the job.  We realize that the time and effort to retrain and get up to speed is prohibitively costly, and needs to be avoided wherever possible.  So demands for specific skill sets in the right combination with each other will continue.  And some combinations will be of greater value, and in greater demand, than others.  It can also happen that the more identified you are to a certain type of job or position, the less well suited you may seem for other jobs or positions. 
Title: Wild stuff
Post by: Kent Sarikaya on November 25, 2007, 03:30:35 AM
Chances are anyone following this thread would be interested in this wild topic about functional languages and their power.

http://channel9.msdn.com/Showpost.aspx?postid=358968

Even a dummy like me could almost understand it all. Well, I could follow along while he presented the topic, but in no way could I work my way through it on my own :)

I still don't see how it simplifies complexity, it just seems you could do similar things in any programming method, which he shows via c#.

Title: More Wild Stuff on Functional Programming
Post by: Charles Pegge on November 26, 2007, 12:01:11 AM
Thanks Kent, I managed to get about half way through this video, before my brain went fuzzy. He is trying to convey some important principles in functional programming but the puzzle is how to apply them in a real project.

So I've been looking for material that relates functional programming ideas to the world of computer graphics and other complex areas of application. This is the best I've seen so far.

Tangible Functional Programming

'We present a user-friendly approach to unifying program creation and execution, based on a notion of "tangible values" (TVs), which are visual and interactive ...'

http://www.youtube.com/watch?v=faJ8N0giqzw
Title: Re: Improvements in Programming Languages
Post by: Kent Sarikaya on November 26, 2007, 03:51:45 AM
An amazing follow-up video to the previous one. I guess they are a dynamic duo of videos opening up a whole new world of thinking and working.
They definitely go hand in hand really well, I think.
Title: Re: Improvements in Programming Languages: esoteric stuff
Post by: Charles Pegge on November 28, 2007, 04:36:12 PM
Parametric polymorphism

This will give you a flavour of how functional programming theorists think. The speaker deals engagingly with a very abstract subject, and elucidates the mathematical and philosophical origins of type theory and polymorphism, starting with Gottlob Frege over a century ago through to the present day, with languages like ML, OCaml and Haskell.

In the Principia Mathematica, attempting to prove the validity of mathematical concepts, Russell and Whitehead took 400 pages to prove that 1+1=2. But this was done without the assistance of a computer. :)

I am sure I have not grasped the full significance of this talk, but it's something to do with making program design more robust by eliminating ad hoc design decisions.

http://video.google.com/videoplay?docid=-4851250372422374791

Title: Re: Improvements in Programming Languages
Post by: Kent Sarikaya on November 28, 2007, 06:24:55 PM
Thanks Charles, I had seen that video before, but it was way beyond what I could grasp. But now seeing the other 2 videos, I can understand the gist of it better, but it is still way beyond me. Right now I see it more as a programming flow diagram, looking at these videos, and very hard to put into actual usable code that I can follow :)
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on November 28, 2007, 07:20:54 PM
This stuff comes from a notion of pure mathematics. As I see it, maths is all about patterns, no more, no less, so I am resolved not to be intimidated or bamboozled by obscure language. It would help if these guys used more evocative words and some practical examples; then many more people would understand what they are talking about.

The sense I get from these talks is that functions should be simple, unbreakable and as compatible as possible so that they can be used in any combination as long as the parameters are of the correct type.

The functions in functional languages have the additional feature of being able to receive and return other functions as though they were variables, giving another dimension of flexibility.
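A minimal illustration of that point in Python (the compose helper and the little lambdas are just examples of mine): compose receives two functions and returns a new one, without referring to any particular data.

```python
def compose(f, g):
    """Return a new function applying g first, then f: functions as values."""
    return lambda x: f(g(x))

inc = lambda n: n + 1
double = lambda n: n * 2

inc_then_double = compose(double, inc)   # a function built from functions
print(inc_then_double(5))                # double(inc(5)) = 12
```

Nothing here runs until the composed function is finally given an argument, which is the "extra dimension of flexibility" in miniature.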
Title: Re: Improvements in Programming Languages
Post by: Kent Sarikaya on November 29, 2007, 06:32:24 AM
Wow, your description really encapsulates what was going on in those videos in a clear fashion. What a great way to explain it!!
Once you master this, you will need to be the one to bring it to the masses, with nice easy examples to follow.

Good luck on the studies and adventures into this dizzying world!
Title: Re: Improvements in Programming Languages
Post by: Charles Pegge on November 29, 2007, 10:31:58 PM
We could try this functional approach in the 3D world. Here is a way of describing the surfaces of various shapes:

All these expressions equal 0 when any  point (x,y,z) lies on the surface of the shape. These seem to be brain teasers at first but are potentially very useful. Can you work out the expression for a corrugated surface?

Infinite horizontal plane

y

Sphere

x^2 + y^2 + z^2 - 1

These require more than 1 null expression to describe them

Cone

x^2 + z^2 - y^2
(y<0)
(y>1)


Cylinder

x^2 + z^2 -1
(y<0)
(y>1)
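These surface tests translate directly into code. Below is a Python sketch of the expressions above, plus one possible answer to the corrugated-surface teaser (the sine form, with ripples running along the x axis, is my own guess at it):

```python
import math

# Each function returns 0 when the point (x, y, z) lies on the surface.
plane    = lambda x, y, z: y                      # infinite horizontal plane
sphere   = lambda x, y, z: x*x + y*y + z*z - 1    # unit sphere
cylinder = lambda x, y, z: x*x + z*z - 1          # plus the 0 <= y <= 1 clips

corrugated = lambda x, y, z: y - math.sin(x)      # hypothetical teaser answer

print(sphere(1, 0, 0))                 # 0: on the unit sphere
print(corrugated(math.pi / 2, 1, 7))   # ~0: on the crest of a ripple
```

A ray tracer or marcher only ever needs the sign and magnitude of these expressions, which is what makes the representation so composable.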

Title: Re: Improvements in Programming Languages
Post by: Donald Darden on June 09, 2008, 11:31:27 PM
I think it is safe to say that language is the tool that gives us the mental lift to stand tall enough to perceive new ground for our thoughts.  We postulate what we can see as a new approach, term it, then teach the concept and the terms until we are familiar with both, and look for situations where this new approach seems to bear new fruit.  If we find any, we consider this validation of the whole process.

During a certain period of art, figures in paintings were sized according to their perceived importance.  When perspective was introduced in a painting that appeared to be a mirror reflection or a town square, it took the art world by storm.

With the advent of digital circuits, we had to begin thinking in terms of binary arithmetic and logic, of absolutes rather than vague quantities such as "few" or "many", or "most probably" or "least likely".  Yet at the same time, we progressed from "things" to the concept of "information" about things as being equally valid.
Do originals really have some intrinsic value of their own that make them worth
what some people will pay for them, or is there some illusion involved that helps distinguish the original from mere duplicates?

Some computer languages strive to go ever higher in form, separating themselves from the mundane of actual computer hardware and how it works,
while other languages strive to bring us closer to dealing with how computer circuits actually work so that we can achieve greater perfection in planned performance.  Many languages strive to be more "natural", by which we mean
they reflect the way we've learned to talk and think and conceptualize things,
and the hope of some is that language will eventually lead us to the point where
we cannot distinguish man from machine.

These are interesting ideas and notions, and there is no doubt that we are making inroads on many of the problems of creating machines that we can communicate with, and with the way we communicate with them, and they with us.  But that does not signify that we can really elevate machines to the same plane that we have arrived at, partly because we do not really know how we got here, and also because being human and subject to mankind's limitations are not reasonable design goals for creating new machines.  It is too easy to acquire
humans directly if that is what you ultimately want.