Author Topic: Problem #3: Development Software  (Read 4939 times)


Offline Donald Darden

  • Sr. Member
  • ****
  • Posts: 363
  • User-Rate: +3/-13
Problem #3: Development Software
« on: March 04, 2008, 11:40:36 PM »
This is not really a problem for the average user, but it is certainly a key issue for developers.  I see a lot of people argue that there is a market for a Linux version of PowerBasic, but I don't believe that there is any real opportunity there.  Let's face it, a big shop can pick and choose its environment where cost is not the key factor, but a small shop is going to look for the least costly way of doing things.  The trouble is, if a shop adopts Linux, then it is hoping for a free ride by using Open Source as much as possible, and it is not going to pay big bucks for custom software development.

It is often a matter of perception and relationships.  In my first incarnation as a computer programmer/technician, I worked in a major data center that paid the princely sum of $186 per hour just for the power to keep the center operating.  The computers cost millions of dollars, and we had a maintenance staff of about 30 people just to keep things humming.  Compared to that, software development costs were minimal.  Now, you can spend under $1000 to buy a PC that has more power, speed, and storage than that whole data center had, and the only thing that keeps software costs down is mass duplication and market volume.  So to make that work, you need to design software that appeals to the mass audience.  That really puts a crimp in the independent developer's game.

But if you stick to Windows, you not only appeal to the larger audience, you also are approaching the crowd that is accustomed to paying for what they acquire.  Sure, there are many freebies for Windows, but the really good stuff often comes at a price.  So you have a group that is prepared to accept the fact that anything good from you is going to cost them as well.

Now consider that the average Linux user is either escaping from the perceived tyranny imposed by M$ or Apple, or is a maverick interested in doing his own thing.
A small business might consider Linux and Open Source as viable alternatives to managing everything, but some key person has to take leadership to make the decision and take the gamble.  It could be that they've decided that the money not paid in licenses can now go into custom development, but I think it more likely that they will want to search for the cheap way out, and that means trying to adapt what is already out there if they can.

Which is pretty much why I don't think there is much chance for developers who want to get paid to find many opportunities in the Linux and Open Source communities.

But rather than wait (possibly in vain) for there ever to be a PowerBasic version for Linux, which has already been dubbed PB/Linux, why not look for evidence that some existing cross-platform development tool is opening doors of opportunity in the Linux community.  You have everything from Assembler, C, C++, Python, and Fortran to numerous dialects of BASIC and scripting languages to choose from.  Some give you the opportunity of designating your target OS, and others attempt to give you similar capabilities under any supported OS.

The thing is, that PowerBasic has done a great deal of work to continue to enhance its products by adding more and more capabilities that can be executed directly from within it, rather than forcing programmers to adopt and master other methods of expanding their abilities.  It has made a lot of budding programmers into drones, accustomed to being hand fed rather than learning to fend for themselves.  Sure, you work hard, but you have been spared the effort of casting around for some idea of what might be out there that needs to be learned next.  Instead, you study PowerBasic and play with the snippets, and exchange views via the forums, and you live a very sheltered existence.

So naturally there are a lot of people that want to hang onto PowerBasic, or continue to hold to some other aspect of Windows, such as the many games that will only run in that environment, and can't bring themselves to part company with these things.
Virtual Machine software and PCs powerful enough to make VM performance acceptable may be one way to resolve this, but I think it is important to recognize that moving to another OS is more about what you are giving up, at least initially, than what you might expect to gain.

 

Offline Edwin Knoppert

  • Sr. Member
  • ****
  • Posts: 254
  • User-Rate: +11/-4
    • Hellobasic.com
Re: Problem #3: Development Software
« Reply #1 on: March 05, 2008, 11:06:24 AM »
Donald,

Sometimes I am just too stupid to understand new things like Linux..

A while ago I tried to make it work and had some real issues with screen resolution and with fixing it via the config.
This is one of the major unfriendly 'flaws' of current Linux packages.

I wonder if you can set up a Linux environment (any) with .NET support using the Mono stuff.
If you would like to investigate, I would suggest using MS VPC so the hdx files can be shared.

I think you will help a lot of people this way, and since you are pretty thorough, maybe you can enlighten us with simple how-to's?

The Linux distros are still very unfriendly and, for a Windows user like me, not very understandable.
For example file extensions, it's all different..
It confuses me.

To edit the config file to fix the 'common' resolution issue (due to VPC, afaik), you'd practically need a degree.

www.mono-project.com/

Offline Donald Darden

  • Sr. Member
  • ****
  • Posts: 363
  • User-Rate: +3/-13
Re: Problem #3: Development Software
« Reply #2 on: March 05, 2008, 07:41:02 PM »
In Ubuntu, fixing the screen resolution is very easy, because the menu system adopted by Ubuntu is straightforward.  On the left side of the top toolbar, you find System, and under System you find Preferences, and under Preferences you find Screen Resolution.  We normally indicate this by writing System/Preferences/Screen Resolution.  Like Windows, you get to try out a resolution first.  Unlike Windows, you can also set the scan rate for your monitor.  This can be important with CRT monitors, since it controls the width and position of the screen image.  On my CRT, I found through several efforts that 800x600 at 82Hz is best for me.

You can manually make changes via configuration files, as you attempted to do, but that's like modifying INI files or changing the Registry in Windows.  It's usually best not to, unless you know what you are doing or are following detailed instructions.
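For reference, the configuration file usually involved on Ubuntu of that era is /etc/X11/xorg.conf.  Below is a rough sketch of the two sections that control resolution and refresh rate; the identifiers and sync ranges are illustrative assumptions, not values to copy blindly, since wrong sync ranges can damage an older CRT:

```
Section "Monitor"
    Identifier   "Configured Monitor"
    HorizSync    30-70        # check your monitor's manual before changing these
    VertRefresh  50-85        # must bracket the refresh rate you want (e.g. 82 Hz)
EndSection

Section "Screen"
    Identifier "Default Screen"
    Monitor    "Configured Monitor"
    SubSection "Display"
        Modes "800x600"      # preferred resolution first; X falls back down the list
    EndSubSection
EndSection
```

After editing, you would restart the X server (log out and back in, or Ctrl+Alt+Backspace on systems of that era) for the change to take effect.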

The problem with Linux, in part, is that it is a smaller community and has fewer resources to draw upon.  But at the same time, if you know how to state your problem for a search engine, you can usually find useful information to guide you.

Take the matter of using a terminal console in Linux.  This is pretty much like using Start/Run under Windows, then typing cmd and hitting Enter.  That is not intuitive, yet it is something you learn to do.  So you learn that under Applications/Accessories, you have Terminal.  And when Terminal opens up, you will often type sudo -s and hit Enter to switch to the superuser mode.  Just something to remember.
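As a minimal illustration of that terminal session (the superuser lines are shown commented out, since sudo prompts for a password):

```shell
# Applications/Accessories/Terminal plays the role of Start/Run -> cmd on Windows.
whoami       # prints the name of the user you are logged in as
# sudo -s    # would switch this shell to superuser (root) mode, after asking your password
# whoami     # inside that root shell, this would print: root
# exit       # leaves the root shell and returns you to your own user
```

The sudo -s habit is worth remembering: many Linux how-tos simply assume you are already in a root shell.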

People use Windows with no formal training at all.  That's because there is a lot of point and click involved.  But often they have to get someone to show them how to do certain things, like move files, or compose and send their first email, or how to get on to the internet.  You learn a bit here, you experiment a bit there, and as time goes on, it gets easier and seems more natural.

When you switch to Linux, you are repeating the same process, but now you have to compare what you can't do (yet) under Linux with what seems so natural and easy to do under Windows.  That doesn't make you stupid or slow to learn, it just means you can't get so impatient that you give up.  Believe me, you will learn Linux a lot faster than you first learned Windows, because there are many similarities and you already know most of the capabilities that you want to master.

As I said before, most things you are told to do in Linux are in terms of terminal console operations, what we commonly refer to as command line access.  This is a curse to people that don't type, or have difficulty comprehending nongraphical information.  But it is precise, and easy to use, so even complex tasks can be reduced to a few statements.

The problem is, that if there are any decision points involved, a few statements may be inadequate because they do not tell you what decisions to make, or what options have to be considered.  It helps then to keep looking for other examples and choices made, because in this way you can become aware of those options and the influence they may have.

My posts attempt to perform a service, in that they are somewhat detailed, pointing to sites and postings that I found helpful, then give you the blow-by-blow results of my efforts.  Where I can, I try to give good sources.  But it is a trial-and-error process, and since my hardware is different from yours, and my needs and objectives are not the same as yours, then my results may be different as well.

How much detail do you need?  For instance, I could start off saying that you should get the current version of Ubuntu and burn the image to disk.  But if you have a slow internet connection, that might not be a real option for you.  You could probably buy a CD with it already on there and go that way instead.  So any discussion about downloading files via HTTP or FTP, then burning them with a software package like Nero or one of the free CD burning programs that can be had from many sources, would be wasted, especially if you already know how to do these things, or don't currently have a CD burner that you can use.
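If you do take the download route, it is worth verifying the image before burning it.  A hedged sketch follows; the URL and file names are placeholders, and cdrecord is only one of several burning tools:

```shell
# Download the installation image (placeholder URL; use the real release page):
# wget http://releases.ubuntu.com/<version>/<image>.iso

# Compare the checksum against the one published alongside the image:
# md5sum <image>.iso

# Burn it (one option among many on Linux; Nero or similar on Windows):
# cdrecord -v dev=/dev/cdrw <image>.iso

# Checksumming works on any file; a quick local demonstration:
echo "pretend this is an iso" > demo.iso
md5sum demo.iso
```

A mismatched checksum almost always means a corrupted download, which in turn explains many mysterious install failures.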

Then I could tell you to either install Linux in place of Windows, or set up another partition and install it to that and elect to configure it for dual booting.  These are choices that you have to make, and the amount of drive space, your experience with repartitioning hard drives, and installing new software all come into play.  I chose to use the NTLoader (Windows) for my primary boot process, but someone else might choose to use GRUB or a third party boot manager instead.  More than likely you will have to resort to an editor to make changes to boot.ini (Windows) or menu.lst (GRUB) to get the boot process down exactly the way you want (cut the timeout from 30 seconds to perhaps 3 or 5, rename the boot link so that you will know what your boot choices are).
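To make those two edits concrete, here is roughly what the relevant entries look like.  The partition numbers, kernel file names, and labels below are illustrative assumptions and will differ on your machine:

```
# boot.ini (NTLoader) -- timeout cut from the default 30 seconds to 5:
[boot loader]
timeout=5
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP" /fastdetect
C:\bootsect.lnx="Ubuntu Linux"

# menu.lst (GRUB legacy) -- the same ideas on the GRUB side:
timeout 5
default 0
title Ubuntu Linux
root (hd0,1)
kernel /boot/vmlinuz root=/dev/sda2 ro quiet
initrd /boot/initrd.img
```

With the NTLoader approach, the C:\bootsect.lnx entry points at a copy of the Linux partition's boot sector that you place there yourself; with GRUB as the primary loader, the Windows side would instead be a chainloader stanza.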

It's just not possible to be overly detailed, but if you bother to read through my many posts on this subject, I have tried to explain my choices, why I made them, and tell you enough to put you somewhere down the road towards doing it yourself.  If you are not a do-it-yourselfer, then this is possibly something you should not attempt.
I have scrapped projects or done them again and again in an effort to get things the way I want, so it is not always possible to just get there without a struggle.

I spent about 5 months at one point in my career creating a universal configuration for a PC that encompassed every task and position-specific application that an operational control center needed, and creating a menu system to boot into whichever mode was required at that point in time.  It was rough, because we are talking about 386s with 256 MB of RAM and 2 GB hard drives, and these had to encompass DOS, OS/2, Windows98, and Linux OSes as well as certain custom applications.  I would make a bit of progress, have to create new images several times a day to capture the gains I made, and if the machine became unstable and crashed (a common occurrence), I would have to go back to the last good image and resume my efforts.

I finally got it finished, and it was rock solid.  I then restored that image on every PC in the center, and from then on, operators could work from any position at any task as needed, and if a PC crashed, they could just move to another one and keep working.  Every PC served as a backup for every other PC, and I had one image for all (multiple copies of course).  We could no longer be impacted by the loss of one or two PCs, and people found that the new configuration allowed them great flexibility in where they sat and how they did their jobs.  I also only had to update one PC, make a new image, then use that image to restore every other PC to the same configuration.  This was much faster than updating each PC individually, especially since I could have several PCs being restored from backups at the same time.

But the thing is, that was an isolated case, where all the PCs were physically the same, and all the possible uses were understood and incorporated together.  That's certainly not the case today, and the closest you are going to come to that is to confine yourself to a small segment of the market.  For instance, instead of picking just any distro of Linux to use, you instead try to get one that is as mainstream as they come, and where you can find lots of supporting posts on related forums.  That just makes it easier for starters.  And it is a good reason for picking Ubuntu.