A.D.A. Amiga Demoscene Archive

Amiga Demoscene Archive Forum / Coding / Why don't sceners use NTSC?

 

Cefa
Member
#1 - Posted: 22 Apr 2006 22:07
I'm wondering why Amiga coders don't use the NTSC mode on the Amiga?
Because many demos use the resolution 320x200, but in PAL mode that is not fullscreen, so you have to centre the screen in the middle. But 320x200 is fullscreen when you run it in NTSC mode. Plus NTSC mode is 60Hz, which means the Amiga runs a little faster. Although I heard that the NTSC mode on PAL Amigas is not really NTSC but PAL/60, so that could give some problems with some projector screens. But most new projectors should be able to handle PAL/60 just fine, I hope :)
rload
Member
#2 - Posted: 23 Apr 2006 04:45
Indeed... I remember some Sonik Clique prod crashing and burning at The Gathering just because the projector couldn't handle it. Since then I haven't dared to use it.
Vanquish
Member
#3 - Posted: 23 Apr 2006 13:31
From my experience in converting all the screens for ADA, there are a lot of demos that do run in NTSC. I distinctly remember watching demos that flick into 60Hz because you see the TV flick and roll over for a frame or two as it re-syncs.

I have to be careful of this because I need to try and preserve the aspect ratio. As a standard TV is 4:3 ratio, a 320px wide display should be 240px tall to maintain a square-pixel display. However, keeping the frame buffer at 200px tall and flicking the mode to 60Hz gives the appearance of a square-pixel display with less memory (because the frame buffer is smaller).

So, when I convert screens, I have to decide whether to compensate for this when scaling the image down to the screensize we have on ADA. Sometimes if I don't, the screen looks too flat - a good example is that any spheres look like eggs on their side!

Also, of interest, there have been a few demos where the screens we've grabbed have been 320x100. The demo runs in 60Hz and skips every other scanline, thus faking a 320x200 display - basically, it gives you a pseudo full screen with only half of the frame buffer. I can't remember off hand, but I'm sure one of the Virtual Dreams demos uses this technique on some of its rendered (CGI) sequences?

Taking the refresh issues from my games background, the 50Hz / 60Hz debate isn't always about speed, but about time. When you run at 50Hz you have more time each frame to calculate your routines before the next screen refresh.

Interesting topic! :)
xeron
Member
#4 - Posted: 23 Apr 2006 22:50
320x200 centred on a PAL screen gives you slightly more CPU time per frame compared to NTSC... but I think the main reason is that most sceners live in PAL countries.
Cefa
Member
#5 - Posted: 27 Apr 2006 21:16
Yeah, Virtual Dreams also did something with the screen in the demo "Love", when the transparent 3D blob scene is shown. I was only able to see that particular scene on a Commodore 1085S monitor. When I tried to run that demo on my multisync monitor I just got garbage :(
noname
Member
#6 - Posted: 28 Apr 2006 09:27
Yes, the "Love" screenmode was evil.

Essence did a better job in the intro for "Virgill Dreams", which used 320x200 NTSC that in fact scaled up to full vertical height on PAL monitors (at least on most monitors - on some it didn't scale up to full size). Which means they had an optical full screen but only had to calculate 4/5 of it, saving 20% CPU time.
Angry Retired Bastard
Member
#7 - Posted: 28 Apr 2006 13:24
The PAL->NTSC switch is easily done by clearing bit #5 of $dff1dc (BEAMCON0).
"Virgill Dreams", "Invitation None" (the Sonik prod rload mentioned) and CnCd's "Killer" all used this kind of NTSC setting. (Although Killer used a system-screen setup, so they might not have done the NTSC-forcing with direct bit-clearing.)
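For anyone wanting to try it, a minimal sketch of that switch (hedged: this is from memory, BEAMCON0 only exists with an ECS or AGA Agnus, and since the register is write-only you write the whole value rather than read-modify-clearing a single bit):

```
; Force NTSC or PAL timing via BEAMCON0 ($dff1dc).
; ECS/AGA Agnus only; the register is write-only.
BEAMCON0 = $dff1dc

SetNTSC:
        move.w  #$0000,BEAMCON0     ; PAL bit (bit #5) clear -> NTSC, 60Hz
        rts

SetPAL:
        move.w  #$0020,BEAMCON0     ; PAL bit set -> PAL, 50Hz
        rts
```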
Cefa
Member
#8 - Posted: 28 Apr 2006 20:00
Just noticed, when I run these NTSC demos mentioned here, that WinUAE does not support NTSC mode :) Pushing the tab button on the boot screen does nothing :(
winden
Member
#9 - Posted: 7 May 2006 11:23
Nowadays we may end up having to go to NTSC, so that demos don't drop frames when shown on projectors at big parties such as Breakpoint, which only support a 60Hz refresh rate. C64 sceners, for example, were bitching a bit about this, and I recall reading in a forum that "the beamer is making our one-frame scrollers drop frames due to Hz conversion".
Sir Garbagetruck
Member
#10 - Posted: 11 May 2006 11:41
Because NTSC sucks total and complete ASS when it's the only thing you have to use, and all the demos are in PAL. Vblank timing completely effs up, etc etc etc. And you spend 3-4 years dealing with this, plus 2-3 years in combination doing NTSC fixes, and you just sort of go "ya know what... NTSC sucks total and complete ASS."

Tho I remember way back when, talking with Dr. Skull about this (I think it was Dr. Skull) on irc, and he said he'd used NTSC for something in a VD demo... and I was shocked, because I'd spent, what, 5 years dealing with NTSC and having to put up with its total suckiness... and someone actually USED it for something. This of course reminded me that everything has a use. Including things that suck.

If companies aren't making PAL capable displays today, Europe should BITCH. Really LOUD.
krabob
Member
#11 - Posted: 11 May 2006 13:10
" Everything has a use. Including things that suck."

OUR MOTTO !

... personally, I completely *stopped* using direct copperlists for screen opening in 1999, because of "things like that". NTSC is one issue, but for example, if you have a Workbench in DBLPAL, DBLNTSC, Productivity or other strange non-PAL modes (which is common with multisync Commodore monitors), most demos will have a "wrong copperlist" once launched, and worst of all, the original Workbench's copperlist will be trashed when returning.
And once again, against the oldschool behaviour, I find a classic OS screen opening is the better solution. And notice the OS lets you touch its copperlist, so there must be ways to do 100% system-friendly copperbars !!!
winden
Member
#12 - Posted: 12 May 2006 14:24
The problem is not PAL vs NTSC, but that projectors are primarily designed for VGA signals, and VGA signals are 60Hz by standard... what I wonder is whether projectors don't have a composite or s-video input besides VGA...
doom
Member
#13 - Posted: 25 May 2006 00:06
winden: Lots of (most?) projectors have composite and/or s-video inputs. AFAIK many even support PAL-RGB. Anyway, I'm not sure the problem at Breakpoint is the projector. I'm guessing it starts when the demos are prerecorded.

krabob: LoadView(null) to reset the display, then Forbid() to pwn the system.

truck: Few things suck as hard as NTSC. I wonder what the situation will be for Amigas when HDTV kicks in.

xeron: 20% is "slightly" more? :)

Cefa: 320x200 in PAL mode gives a nice widescreen effect, so you can cover up the fact that your routines are slow as hell and claim it's a design thing. Also, we like (almost) square pixels. And more CPU time per frame. And we take pride in our European standards. And we must stay prepared for the oncoming retrovolution.
krabob
Member
#14 - Posted: 29 May 2006 10:26 - Edited
>LoadView(null) to reset the display, then Forbid() to pwn the system.

No, again: LoadView(NULL) will reset the copper, but the previous Workbench copper state is lost if you were configured with DBLPAL, DBLNTSC, Productivity or other strange non-PAL modes. It means that on returning, you will have a trashed, incoherent video mode on your system. At the least, you need to:
- open a dummy PAL classic intuition screen (just 1 line is enough)
- LoadView(NULL)
- on returning, kill the dummy screen; it will bring back the Workbench screen in its correct state.

(And again, better: just use a 100% system-friendly intuition screen; it is not slower and opens things to more configurations.)

> Forbid() to pwn the system.

This kind of thing was absolutely needed on the A500, for blitter banging and sound stuff... but on higher configurations, it appears to me as completely useless (and anti-Amiga: think about configurations other than yours).
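A hedged sketch of the dummy-screen sequence krabob describes (the LVO offsets and NewScreen layout are from memory; a real version must check the OpenScreen return value, and IntuitionBase/GfxBase are assumed already open):

```
; Hypothetical sketch of the dummy-screen trick.
; Assumes IntuitionBase in _IntBase and GfxBase in _GfxBase.

Takeover:
        move.l  _IntBase,a6
        lea     dummyns(pc),a0
        jsr     -198(a6)            ; OpenScreen(&dummyns)
        move.l  d0,dummyscr         ; (should be checked for NULL!)
        move.l  _GfxBase,a6
        sub.l   a1,a1
        jsr     -222(a6)            ; LoadView(NULL) - blank the display
        jsr     -270(a6)            ; WaitTOF()
        jsr     -270(a6)            ; WaitTOF() - let both frames pass
        rts                         ; now free to bang the hardware

Restore:
        move.l  _IntBase,a6
        move.l  dummyscr,a0
        jsr     -66(a6)             ; CloseScreen() - Intuition rebuilds
        rts                         ; the Workbench view correctly

dummyscr: dc.l   0
dummyns:  dc.w   0,0,320,1,1        ; LeftEdge,TopEdge,Width,Height,Depth
          dc.b   0,1                ; DetailPen,BlockPen
          dc.w   0                  ; ViewModes: plain lores PAL
          dc.w   $000f              ; Type: CUSTOMSCREEN
          dc.l   0,0,0,0            ; Font,Title,Gadgets,CustomBitMap
```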
winden
Member
#15 - Posted: 29 May 2006 13:03
But you can just save the old view (stored near GfxBase) and then, on returning to the system, do LoadView(oldview).

And regarding Forbid()... yes, it's not that needed nowadays... to ensure 100% CPU, I think you should raise your task priority to 19 (just below input.device, which is 20).
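winden's priority trick might look something like this (a sketch; exec LVO offsets from memory):

```
; Raise our own task's priority to 19 (just below input.device at 20)
; instead of Forbid()ing the whole system.
        move.l  4.w,a6              ; SysBase
        sub.l   a1,a1               ; NULL -> FindTask() returns our task
        jsr     -294(a6)            ; FindTask(NULL)
        move.l  d0,a1
        moveq   #19,d0              ; new priority
        jsr     -300(a6)            ; SetTaskPri(task, 19)
```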
krabob
Member
#16 - Posted: 29 May 2006 13:49 - Edited
> and then on returning to system, do loadview(oldview)
*As far as I remember*, it does not work properly in the case I described, and most demos fail this test anyway, even very late AGA demos.

I was looking at the "How to code" by comrade J:

http://www.mways.co.uk/amiga/howtocode/

...but I can't find where I read about the open/close intuition screen trick. Anyway, the "How To Code" docs were a perfect demonstration that copper hacks were hard to make work everywhere; each new release told us not to do what the previous one was telling us to do.
doom
Member
#17 - Posted: 1 Jun 2006 19:56
The "How To Code" document is really nasty. FFS, it references Protracker of all programs as the proper way to open a non-system screen. Protracker is the most unstable application of all time, mostly because of the way it handles the display and inputs.

As for the display, what I do is:

- save the active view
- call Loadview(null)
- save current copperlist
- insert own copperlist

And to restore the display

- reload old copperlist (the one that shows a null view)
- reload old view

Has never ever failed. Not once. Except at one point the mouse pointer didn't reappear when loading the active view. I seem to have fixed that later on.

There are a lot of things you have to keep in mind to get it right, of course. For example, even though most people only use one, there are two copperlist address registers, and these are used by the copper itself. Also, you have to own the blitter and wait for the queue to empty before you go on assuming you have control of the hardware.

Most importantly, Forbid() is a must when doing any kind of hardware banging. And disable interrupts and DMA as well. Not to ensure 100% CPU time, but for compatibility (the OS is made to be in charge of interrupts and the copper; leaving it running while hardware-banging is unreliable).

Like I said, this has never failed me. Opening a blank screen is not a bad idea of course, I might do that in the future. But it won't make any difference to those using mode promoters.
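Roughly, the save/restore sequence described above might be sketched like this (hedged: the GfxBase offsets and LVOs are from memory, and real code also saves and disables DMA and interrupts as mentioned):

```
; Hypothetical sketch of the view/copper save and restore.

Save:
        move.l  _GfxBase,a6
        move.l  34(a6),oldview      ; gb_ActiView - the active View
        sub.l   a1,a1
        jsr     -222(a6)            ; LoadView(NULL)
        jsr     -270(a6)            ; WaitTOF()
        jsr     -270(a6)            ; WaitTOF() - long + short frame gone
        move.l  38(a6),oldcop       ; gb_copinit - the system copperlist
        ; ... install own copperlist, own the blitter, etc ...
        rts

Restore:
        move.l  _GfxBase,a6
        move.l  oldcop,$dff080      ; COP1LC <- system copperlist
        move.w  d0,$dff088          ; strobe COPJMP1
        move.l  oldview,a1
        jsr     -222(a6)            ; LoadView(oldview)
        jsr     -270(a6)            ; WaitTOF()
        rts

oldview: dc.l   0
oldcop:  dc.l   0
```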

As for the whole system vs. non-system thing, I consider code that runs on any Amiga running AmigaOS (not counting A1 which is an Amiga-branded generic PPC machine running an AOS clone) to be system friendly, even if it disables the system in a friendly manner. One of my favourite aspects of AmigaOS is that you can disable it when you want to. I couldn't imagine coding without complete hardware access. Programming yes, but not coding. I distinguish because I'm a retarded oldsk00ler.
krabob
Member
#19 - Posted: 2 Jun 2006 14:44
> One of my favourite aspects of AmigaOS is that you can disable it when you want to.

I agree, it is a great and unique aspect of AmigaOS, amongst a lot of great aspects of this system.

> I distinguish because I'm a retarded oldsk00ler.

I disagree, you are not retarded: you succeeded in giving an intelligent answer to a flamer.

> I couldn't imagine coding without complete hardware access. Programming yes, but not coding.

Ahaha, you've been taken into the Amiga matrix !!! You thought you were accessing memory directly while coding your asm, but it is false !!! Your conscience is just the first point of a chain of interfaces that link your thought to a machine you know nothing about; no one could. Each point of the chain offers a new abstraction on the consequences of what you did. An assembler and an Amiga may look like an understandable first interface to use, but it is not, because it is only the second point of a chain, not the last.
Finally, no one knows the shape of what is at the other side of the interface chain.
That's what Comrade J tried to find, but obviously, he failed.
Some people speak about a huge bullshit named god at the end of the chain, but me and Bruce Lee, we don't believe that. You were creating an Amiga executable through a source; this is not at all the same thing as having "complete hardware access". The interfaces are:
1. the assembler (may vary)
2. the exec loader that handles unimplemented code (may vary, through the 68040/68060 libraries).
3. the hardware configuration (may vary)
4. the graphics/CPU patches installed (may vary).
5. the fact that this whole system context could be emulated (may vary)
6. the fact that soon 99.9% of the execution of your routines will be done through an emulator, as the original hardware gets broken and rare.
7. the interfaces I forgot (may vary).
8. the interfaces to come (may vary).
...
12. the graphical result (may vary, even because of the way the party projector is tuned.)
13. the way images are deciphered and understood by the people (vary, vary, vary and will vary.)
...
What I mean is: whatever you do, you're bound to a unique interface; you cannot control the shape of what is behind it, but some good open interfaces give you enough access to do more: I mean to do hacks by using the system interface.
The other answer is to build your own interface, externalize it, comment it, and then use it. This way, "the point can quit the chain".

Furthermore, when you make a demo, not only is the last point of the chain you want to affect not the "hardware access", but also the first is not your conscience, but your unconsciousness.
rload
Member
#20 - Posted: 2 Jun 2006 21:00
Angry Retired Bastard, who are you?
klipper
Member
#21 - Posted: 3 Jun 2006 19:46
> I couldn't imagine coding without complete hardware access. Programming yes, but not coding. I distinguish because I'm a retarded oldsk00ler.


I totally agree with you guys, and it's NOT just because we are oldsk00l stick-in-the-muds either... there's something else about Amiga coding... something deeper.

I think the Amiga had the perfect balance between abstraction and direct but manageable control. You really felt as though you were "touching the source" (as Krabob might say) with the Amiga... a very spiritual experience indeed. :)

Unlike today's multilayered mass of incomprehensible DLLs and APIs. I do not see how PC coders find joy in coding OpenGL/DX based 3D scenes, for example. To me it is one step away from using a 3D modeller/animation package :( PC = totally soulless programming.
winden
Member
#22 - Posted: 3 Jun 2006 21:46
I think that the real hotspot in the PC scene nowadays is not coding per se, but making a good demomaker for your artists. Mix that with this new boom-boom-boom-VJ-style-demos craze and you can really see that for most remaining Amisceners there is no need to leave. And even more, you can see people are slowly coming back!

ps. krabob, that deconstructionist post, even if a bit embarrassing in an "in fact there is no Santa Claus" style, was a great read.

Disclaimer: yes, I know many people like boom-boom-boom demos, or like coding demotools, or anything else... this doesn't mean we have to feel it's wrong to like having control over the machine and the pixels and the sound samples.
doom
Member
#23 - Posted: 5 Jun 2006 22:32
The never-ending layers of abstraction are precisely the reason I don't touch PC code anymore.

Also, that picture of me is obviously a fake. I never code in HLLs. ;)
Angry Retired Bastard
Member
#24 - Posted: 15 Jun 2006 14:04
rload: I used to do some Amiga coding in the early-to-mid 90s and released a couple of productions (nothing really great, though). Then I quit because of studies (and later work).

 
