
Study Shows Carriers Often Over- and Undercharge for Data

Article Comments  


And?

whatwhatjbdo

Sep 16, 2012, 10:02 AM
Doesn't every business account for error? Show us not just that there is a problem, but how to fix it.

Try this: does a certain coffee vendor that just so happens to cover the nation in its drink bliss, at an average of 4 to 5 dollars a cup, "overcharge" its customers for the drink experience?

This is going to happen in every industry. My T-shirt doesn't have the exact thread count. My car tires do not have the perfect amount of rubber. My CPU has roughly a ±5% overclocking margin. The Phonescoop articles have a chance of errors in their wording (granted, they are fixed quickly, and I don't pay to read the news).

Well folks, there you have it. If you want perfection, we need Phonescoop to build us a cell network. I'm in! 😎
Love ...
(continues)
...
T Bone

Sep 16, 2012, 11:05 AM
That's true, but the difference is that a lot of people are being charged when they aren't supposed to be... and unfortunately, the billing rules are not exactly 'nice', and the way the billing works makes it worse: at $2 a MB, customers who use only 1 KB by accident get charged for a full megabyte. Of course, these small data charges are not really a big deal; all they have to do is call customer service and get them waived... but people do get tired of calling customer service every single month. I don't really know what the solution to this is.
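The round-up billing described above can be sketched in a few lines. The $2/MB rate comes from the post; the round-up-to-the-next-megabyte rule is an assumption about how such pay-per-use billing is commonly applied, not a description of any specific carrier's system:

```python
import math

RATE_PER_MB = 2.00  # pay-per-use rate cited in the post ($2 per MB)
KB_PER_MB = 1024

def billed_amount(usage_kb: int) -> float:
    """Bill usage rounded UP to the next whole megabyte (assumed rule)."""
    if usage_kb == 0:
        return 0.0
    billed_mb = math.ceil(usage_kb / KB_PER_MB)
    return billed_mb * RATE_PER_MB

# An accidental 1 KB session is billed as a full megabyte:
print(billed_amount(1))     # 2.0
# 1 MB + 1 KB rounds up to 2 MB:
print(billed_amount(1025))  # 4.0
```

Under this rule the effective per-KB price for a tiny accidental session is over 1000x the nominal rate, which is why small charges feel so disproportionate.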
...
LegitSmushmortion

Sep 16, 2012, 11:14 AM
Cell phone use is contractual, with a monthly bill. Getting overcharged for something you simply aren't using is not the same as buying overpriced coffee.
...
T Bone

Sep 16, 2012, 12:24 PM
No, it's not the same, and he's not saying that it is exactly the same. He's saying that when you're counting massive amounts of 'stuff', it is unreasonable to expect the count to be 100% accurate, and a certain margin of error is expected... and he's right about that.
...
mycool

Sep 16, 2012, 10:31 PM
T Bone said:
No, it's not the same, and he's not saying that it is exactly the same. He's saying that when you're counting massive amounts of 'stuff', it is unreasonable to expect the count to be 100% accurate, and a certain margin of error is expected... and he's right about that.


That's how I read it. Though I agree that the analogy is still poor, because when it comes to computers doing what they should be doing (computing), these errors should not exist. It simply means that the software (or the hardware's firmware) has bugs and needs to be fixed.
...
T Bone

Sep 16, 2012, 11:01 PM
Well, we are talking about hundreds, maybe even thousands, of terabytes of data... even a computer can't count that much data perfectly... a margin of error around 5% is reasonable... although I wouldn't want to be one of those who gets overcharged.
...
mycool

Sep 17, 2012, 11:09 AM
T Bone said:
Well, we are talking about hundreds, maybe even thousands, of terabytes of data... even a computer can't count that much data perfectly... a margin of error around 5% is reasonable... although I wouldn't want to be one of those who gets overcharged.


Actually, no. In the computer world, a 5% margin of error is just terrible... it falls under Class 0. The most common acceptable class is Class 4, which yields a 0.001% margin of error.
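To put the two margins of error in perspective, here is what each would mean on a month of data usage. The 5% and 0.001% figures are from the posts above; the 2 GB monthly usage is an assumed illustrative figure:

```python
# Compare the two margins of error on an assumed 2 GB of monthly usage.
usage_mb = 2048  # hypothetical 2 GB monthly data usage, for illustration

class0_error_mb = usage_mb * 5 / 100      # 5% margin (the "Class 0" figure)
class4_error_mb = usage_mb * 0.001 / 100  # 0.001% margin (the "Class 4" figure)

print(f"5%     margin -> {class0_error_mb} MB")          # 102.4 MB
print(f"0.001% margin -> {class4_error_mb * 1024} KB")   # ~21 KB
```

On those assumptions, a 5% margin could misbill roughly 100 MB a month, while a 0.001% margin is on the order of tens of kilobytes, below the smallest billing increment discussed above.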
...
srich27

Sep 18, 2012, 8:03 AM
Well, even a 0.001% error can lead to quite a few people getting overcharged. I didn't look very long, so I only grabbed a number from 2008, but back then AT&T had just shy of 100M subscribers. Even at only a 0.001% error rate, that's still 100,000 people with billing errors. Most of those are likely to be negligible, but I'm sure quite a few of those get quite messed up. Of course, that's assuming that only one computer handled all 100M subscriptions, and assuming they have no QA double-check before sending out bills. Neither of which are likely to be true.
...
mycool

Sep 26, 2012, 1:00 AM
srich27 said:
Well, even a 0.001% error can lead to quite a few people getting overcharged. I didn't look very long, so I only grabbed a number from 2008, but back then AT&T had just shy of 100M subscribers. Even at only a 0.001% error rate, that's still 100,000 people with billing errors. Most of those are likely to be negligible, but I'm sure quite a few of those get quite messed up. Of course, that's assuming that only one computer handled all 100M subscriptions, and assuming they have no QA double-check before sending out bills. Neither of which are likely to be true.


1. 0.001% of 100M is 1,000 (not 100,000).
2. I was referring to the data error, not the number of subscribers impacted.

Basicall...
(continues)
...
srich27

Sep 26, 2012, 7:23 AM
No?
0.001% of 100,000,000 = 100,000
Move the decimal 3 places, or double check with a calculator (I just did to make sure)

0.001% of 1,000,000 = 1000

Otherwise your point is perfectly valid. 🙂 No one would notice or even care about a 10KB error.
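The disputed arithmetic in this exchange is easy to check directly. The common slip is treating 0.001% as the fraction 0.001 rather than 0.001/100; a quick sketch (the 100M subscriber figure is the one cited in the thread):

```python
def pct_of(pct: float, n: float) -> float:
    """Return pct percent of n (pct is expressed in percent, not as a fraction)."""
    return n * pct / 100

subscribers = 100_000_000  # the ~100M AT&T figure cited above

# 0.001% of 100M:
print(pct_of(0.001, subscribers))  # 1000.0

# The 100,000 figure corresponds to a 0.1% rate instead:
print(pct_of(0.1, subscribers))    # 100000.0
```

Moving the decimal three places computes 0.001 of a number, not 0.001 percent of it; the extra division by 100 accounts for the two-order-of-magnitude difference between the two claims.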
...

This forum is closed.


All content Copyright 2001-2024 Phone Factor, LLC. All Rights Reserved.
Content on this site may not be copied or republished without formal permission.