Steve's Digicams Forums > Digital Cameras (Point and Shoot) > Hybrid Still/Movie/MP3 Digicams

Jun 30, 2008, 12:50 AM   #11
Senior Member
Join Date: Dec 2007
Posts: 1,153

Yes, agreed -- with the Jazz, Aiptek, and other hybrids, the most important thing is that you enjoy shooting, enjoy sharing, and enjoy the process of learning and creating.

For $99, that Jazz isn't a bad deal -- especially if the end game is a DVD. You're right; these HD hybrids can beat low- and even mid-range MiniDV tape camcorders in both quality and convenience.
Private Idaho
Jun 30, 2008, 2:30 AM   #12
Senior Member
Join Date: Jun 2006
Posts: 1,071

Private Idaho, I think we are talking at cross purposes here. I am referring to the disappointment that AVCHD did not live up to H.264's potential at those bit-rates, the way Ambarella's codec quality has. It was never a matter of if, but of why not, and when. At last AVCHD has stopped being the underclass to consumer HDV, probably thanks in no small part to Ambarella-based cameras and to sites that did not give AVCHD a free ride at the quality it had been producing.
Wayne12
Jun 30, 2008, 3:09 AM   #13
Senior Member
Join Date: Jun 2006
Posts: 1,071

Rgvcam, I was not referring to your post when I said "we". I don't even bother reading their long, contorted reviews spread over pages and pages; I just go to the relevant quality bits before moving on. We were a bit shocked at the camcorderinfo Sanyo HD1 and HD2 review, over at dvinfo. It was obvious that as long as you got far fewer codec problems than an HDTV broadcast signal, it was not such an issue for HDTV work (as long as you shot it properly and maybe filtered out some of the codec problems in post). I managed to convince Chris over at dvinfo of its merit, and now they have sub-forums for professional use of these cameras. As a consumer camcorder it wasn't so crash hot because of the low-light and latitude problems (which would have required consumers to light their scenes and learn camera handling to get around).

Do you remember how bad consumer camera quality was getting by the time consumer HD came along -- how little latitude, how bad the low light, how blown out the colors? I was one of the ones pushing the issue of low light and quality at cc-info, and eventually they pushed it too, and then the manufacturers picked up their game. Thanks to these efforts, we probably have much better quality equipment for consumers, which they can pick up and get decent results from without a big skill set. Without these sorts of efforts we might be faced with cartoon-colored, blown-out pictures with black shadows full of noise grain, and anything less than a well-lit room might be little more than a joke. Now everybody enjoys the new industry mentality, and even a $100 camera produces much better quality recordings of your memories.

I noticed that they switched to giving different verdicts for different user types at the bottom of their reviews. It is perfectly acceptable to tell the prosumer/hobbyist group the way it is.

Anyway, I am getting belted up elsewhere (and in life in general this weekend), so I can't spend as much time on this as I normally would.
Wayne12
Jun 30, 2008, 9:44 AM   #14
Senior Member
Join Date: Dec 2007
Posts: 1,153

I understand your point about AVCHD: the early models -- such as my Sony HDR-UX1 -- only supported relatively low bitrates.

That much is true, but CamcorderInfo obscured the fact that MPEG-2 HDV compression was really pretty outdated... not nearly as efficient as AVCHD... the only saving grace being that HD MPEG-2 was not as processor intensive.

And that brings me to why AVCHD was introduced at lower bitrates: the manufacturers probably did not wish to alienate videographers running relatively low-powered computers.

In fact, this is the key reason why -- in my view -- the hybrid manufacturers still cling to relatively low bitrates.

They're targeting people who are still using relatively weak computers.

To avoid a very negative user experience, they're not pumping up the data rates.

They're waiting until computer hardware -- specifically faster processors and souped-up video display cards -- can give consumers smooth playback.
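To put rough numbers on that bitrate trade-off, here is a back-of-the-envelope sketch (the 8 GB card size and the rate list are purely illustrative, and it assumes constant bitrate, which real encoders only approximate):

```python
# Back-of-the-envelope recording-time math for a few common HD bitrates.
# Assumes constant bitrate and an illustrative 8 GB card; the point is
# how gently these low consumer rates tax storage and data throughput.
CARD_GB = 8

for mbps in (9, 13, 17, 25):
    bytes_per_sec = mbps * 1_000_000 / 8          # megabits/s -> bytes/s
    minutes = CARD_GB * 1_000_000_000 / bytes_per_sec / 60
    print(f"{mbps:2d} Mb/s -> roughly {minutes:.0f} min on a {CARD_GB} GB card")
```

The same arithmetic also hints at the playback side: at 9 Mb/s the disk and decoder only have to move a bit over 1 MB of compressed data per second.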


Private Idaho
Jun 30, 2008, 1:35 PM   #15
Senior Member
Join Date: Jul 2006
Posts: 2,084

Yes, I have always assumed the low video bitrates were because most consumers tend to have relatively modest hardware. I happen to be one of them, although at some point in the future I intend to get a more powerful PC.

Wayne12, I do agree with you on the low-light issue. My old JVC S-VHS-C camcorder had great low-light ability, plus a reasonable built-in video light that worked for close-up work at a dance, etc. The low-light ability of MiniDV and MiniDVD camcorders, as well as today's hybrids, is not as good. The latest hybrids do seem to be getting better in that regard, though. That little Jazzcam seems to have quite reasonable low-light ability for a cheap CMOS camcorder. It even seems to be less noisy than the Sanyo HD700 I tried out for a month or two.

rgvcam
Jun 30, 2008, 2:51 PM   #16
Senior Member
Join Date: Jun 2006
Posts: 1,071

Private Idaho wrote:
Quote:
I understand your point about AVCHD; that the early models -- such as my Sony HDR-UX1 -- only supported relatively low bitrates.
It was how efficiently the bit-rates were being used. MPEG-2 is a mature technology, so it was being exploited efficiently. H.264, however, was not so mature, and not as efficient as it could be in consumer gear. Getting power requirements down means working the circuits less, and using less complex circuits, meaning less processing and less efficiency. Ambarella was a radical design departure meant to do much more processing per unit of power. The problem was not the low bit-rates themselves (though those were disappointing) but how they were being used. There were even MPEG-4 camera makers using MPEG-2-style encoding in an MPEG-4 container and selling the cameras as MPEG-4. In the same way, you can deliver encoding performance similar to MPEG-4 and dress it up as H.264 (after all, H.264 shares many things with MPEG-4).

If you run an H.264 Intra-only codec, you will use a lot more bit-rate for the same quality, but perhaps less processing power.

The reason they cling to lower bit-rates probably has a lot to do with the implementation curve, power consumption, etc., plus higher bit-rates challenge prosumer equipment (the new AVCHD-based Panasonic camera uses a fuller bit-rate). I would not be surprised if 9 Mb/s Ambarella footage requires more processing power to decode than the old 13 Mb/s cameras. In fact, if anybody has a Sanyo HD1000, test its playback performance against an Aiptek in 720p60.
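A rough sanity check on that playback comparison: 720p60 pushes almost as many pixels per second as 1080p30, so comparable decode load is plausible. Pixel rate is only a crude proxy, though -- real H.264 decode cost also depends on profile, entropy coding, deblocking, and bitrate:

```python
# Pixel throughput as a crude proxy for decode load.
# (Actual H.264 decode cost also depends on profile, CABAC vs CAVLC,
# deblocking, and the bitrate itself -- this is just the raster math.)
formats = {
    "720p60": 1280 * 720 * 60,
    "1080p30": 1920 * 1080 * 30,
}
for name, px_per_sec in formats.items():
    print(f"{name:>8}: {px_per_sec / 1e6:5.1f} Mpixels/s")
```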


rgvcam

The cameras we have today are dramatically better than the ones from around four years ago in low light, latitude, noise, etc., believe me. Even though I have seen some poor low-light examples from the latest Aiptek compared to the AHD model, the latitude seems better -- 4-8 stops off what we need ideally (rather than 4+ stops off, as we would see in the past). I can say the Casio F1, using the Sony chip, seems to be 4 stops off where I would like to see it, but from the very small sample of Sony SR12 footage I have seen, it seems to be within range, and maybe 4 stops better than the Casio.
Wayne12
Jul 1, 2008, 1:23 AM   #17
Member
Join Date: Jun 2006
Posts: 71

I'm assuming one of the reasons for the low bitrates on hybrids is to keep the files small for uploading straight to YouTube.
H.264 is much more efficient, but at this stage it's a real pain to deal with.
It will get better once hardware H.264 decoding/encoding is handled by the graphics chip, the way MPEG-2 is now.

Rodfather
Jul 1, 2008, 8:24 AM   #18
Senior Member
Join Date: Jun 2006
Posts: 1,071

It is now, as well. ATI has hardware for it, and I assume Nvidia does too by now. But you need programs that make use of it.
Wayne12
 