famicomaster2
Experienced Member
To my knowledge, by the time this problem had been discovered, modem standards had basically all converged under the ITU, excluding a few high-speed protocols that were available in the States. It is also my understanding that around WWII, most of the world's telephone systems were modified to interoperate with the Bell system (or had converters to do so) to make international calling easier. Many other countries already had existing Bell networks anyways.

> However, I would think that there must have taken place some work on modem standards elsewhere too.
The "problem" is that when these standards were being put in black and white for equipment to be made, long-distance trunks used FDM, like L carrier, to transmit multiple calls on a line. This means you have rather close tolerances (and often crosstalk!) on your frequency band. The Bell system already included filters in phones, so it was a reasonable expectation to just filter out what you wanted for the trunk anyways. The design was intended for, I believe, roughly 300-4000 Hz. Thus, lines were always designed for that frequency band, and when digital trunks came around with T carrier by the late 60s, the ADCs were built to sample at the Nyquist rate for that band (8 kHz) to avoid aliasing.

> Sure, Europe was way more split, with different countries having different telecom monopolies, and later different competing telecom companies, using different hardware suppliers and whatnot. But I would think that many of these would have used better "sound" quality ADCs than those you mention, simply due to the switch to digital taking place later on.
Besides, the carbon microphones in actual Bell phones (the only phones that were allowed to be connected to the network by law for the better part of a century) didn't have that kind of frequency response anyways.
I've heard tales of local calls in the 60s and 70s being unfiltered because the local exchange had a crossbar or something, but I'm doubtful that went anywhere meaningful. By the time modems started hitting these barriers in the late 80s, it was already far too late for any major population to reliably use a "clear line" like that, so modems are simply not designed to operate that way. The expectation has always been that lines will be filtered, have load coils, and so on, since that was set in stone from the start. The exception was leased lines, but that's a whole other story.
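As a rough illustration of the numbers above (the ~4 kHz band, the 8 kHz PCM sampling rate, and why 56K is the ceiling), here is a back-of-the-envelope sketch. The 35 dB SNR figure is my own assumption for a decent line, not something from the thread:

```python
import math

# Rough telephone-channel numbers from the discussion above.
bandwidth_hz = 3100      # usable analog band, roughly 300-3400 Hz
sample_rate = 8000       # T-carrier PCM sampling rate (Nyquist for a 4 kHz band)

# Shannon capacity of the fully analog path, assuming ~35 dB SNR (my assumption):
snr_db = 35
capacity = bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))
print(round(capacity))   # ~36 kbit/s: roughly v.34 territory

# A V.90-style downlink rides the PCM samples directly: 8000 samples/s,
# with about 7 usable bits per mu-law sample in practice:
print(sample_rate * 7)   # 56000: the familiar 56K ceiling
```

Which is why the analog-modulation ceiling landed around 33.6K, while a digitally connected endpoint could push toward 56K.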
I've never used a Patton SmartNode, but I know they're in direct competition with the Adtran Total Access, which you can get for next to no money; I've seen TA912s go for as low as $20 used on eBay. My setup originally included a TA924e 2G, and now a TA924e 3G, both of which have reliably passed 54666 on my local network (analog to PRI), and I've gotten a peak of 49333 from a "real" 56K ISP through it externally. It has a lot of SIP options and it's really easy to configure, so that's the route I went. The third-gen TA900e series has a gigabit port and four PRIs, network or user side. If you want something smaller, I think the TA600 series can probably do all the same stuff for potentially even lower cost.

> Patton also makes a couple series of ATAs - they call them SmartNode. If the Grandstream doesn't produce favorable results I might give one of them a try. They are in a higher price tier, which 1) sucks but 2) means they might be better?
Unlike the offerings from Cisco and Grandstream, they also have "modern" models with gig-E too.
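Incidentally, the 54666 and 49333 figures above aren't arbitrary: as I understand it, V.90 connect rates sit on a grid of multiples of 8000/6 bit/s (8000 PCM samples per second, an integer number of data bits per 6-sample frame). A quick sketch of that grid:

```python
# V.90 connect rates, as I understand them: multiples of 8000/6 bit/s,
# from 28000 up to the 56000 ceiling. Modems report the truncated value.
def v90_rate_grid(low=28000, high=56000):
    grid = []
    bits_per_frame = 0
    while (rate := bits_per_frame * 8000 / 6) <= high:
        if rate >= low:
            grid.append(int(rate))   # truncate, e.g. 54666.67 -> 54666
        bits_per_frame += 1
    return grid

print(v90_rate_grid())
# 54666 and 49333 both land exactly on this grid.
```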
Yes, USR/3Com never even tried to implement it, and Patton eventually disabled it. As far as I know, all "real" v.92 modems support PCM uplink, but I've never actually seen or heard a connection made with it. One bit sent during the ANS phase of the call determines whether PCM uplink is available or not. If the RAS (or whatever digital modem you call) doesn't have this bit set, you don't get PCM uplink; the modems don't even try for it.

> I assume you mean they never implemented it/disabled it in their RAS?
As far as I know, PCM uplink is available and enabled in most actual v.92 modems. This might be an erroneous assumption on my part; my sample size is only a few dozen. I don't really see any reason they shouldn't be able to do PCM to each other at 48K, but my understanding is that if neither modem claims to be digital, they will always drop PCM as an option immediately and go for v.34bis trellis coding, regardless of line quality or other capabilities.

> What about connecting two v.92 modems together on a local analog PBX? Was the PCM uplink feature deprecated in the modems as well, or just the RAS?
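The fallback behavior described above can be summed up as a small decision sketch. This is a hypothetical model of my reading of the thread (the single "digital" capability bit gating PCM), not the actual V.8/V.92 handshake:

```python
# Hypothetical sketch of the mode selection described above: the answering
# end's "digital" capability bit gates PCM entirely.
def select_mode(answerer_is_digital: bool, pcm_uplink_bit_set: bool) -> str:
    if not answerer_is_digital:
        # Two analog v.92 modems on a PBX: PCM is dropped immediately,
        # regardless of line quality, and they fall back to v.34.
        return "v.34"
    if pcm_uplink_bit_set:
        return "v.92: PCM downlink + PCM uplink"
    # Digital answerer without the uplink bit: v.90-style connection.
    return "v.90: PCM downlink, analog uplink"

print(select_mode(False, False))  # two analog modems: "v.34"
print(select_mode(True, False))   # typical RAS: v.90-style
```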