Solved: Large attachments in mail

December 3, 2019 at 11:49:53
Specs: Windows 10, 8GB
Hi,

I have the following question, which is not specific to Thunderbird, Outlook, or even any webmail application.

Last week I had several reports of attachments that were too big to be sent via email.

There can be an actual limit imposed per email, for example a limit of 10 megabytes per message.

I remember the time when internet speeds in general were a fraction of what they are now, and in my mind, this was the reason why back then - the '90s - there were limits imposed on attachments.

I'm confused as to why exactly these same limits are still imposed, given that the major cause of this problem has disappeared.

For example, today you can download data at 20 megabytes per second, while back in 1995, on a dial-up modem, that would have been a few kilobytes per second.

The second reason I can think of is storage ... but here too I'm thinking there is no issue at all. Cloud storage these days is measured in gigabytes, not in megabytes.

message edited by Looge


#1
December 3, 2019 at 13:05:27
Cloud storage is not limitless nor free. Emails get saved in multiple locations as they make their way from server to server to server while en route to their destinations. Toss in the overhead that gets added to an email (about 1/3?) and you can see how that 10 MB file, enlarged by overhead and multiplied many times, has a much bigger impact on the infrastructure than just the original 10 MB.
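That "about 1/3" overhead is not hand-waving: attachments travel as MIME parts, which are typically base64-encoded, and base64 turns every 3 raw bytes into 4 ASCII characters. A minimal Python sketch (the 10 MB size is just an illustration):

```python
import base64

# Base64 encodes every 3 raw bytes as 4 ASCII characters, so an
# attachment grows by roughly one third on the wire (before headers).
raw = bytes(10_000_000)           # stand-in for a 10 MB attachment
encoded = base64.b64encode(raw)

print(len(raw))                           # 10000000
print(len(encoded))                       # 13333336
print(round(len(encoded) / len(raw), 3))  # 1.333
```

So a server that wants to cap the on-the-wire message at 10 MB effectively has to reject attachments larger than about 7.5 MB.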

Then there's virus scanning and other security related overhead. The bigger the file, the longer the delay in processing. Bottlenecks become an issue, costing time and money to resolve.

Sure, you can download data at 20 MB/s, but can everyone, at one time, download data at 20 MB/s? That would depend on the robustness of your network infrastructure. If not, we're back to talking about bottlenecks while Bobby, Susie, Fred, etc. download that full-length movie that Bill pirated and emailed to 100 of his closest friends.

A flaw in your reasoning (IMO) is that you've applied the vast improvements in data transmission and storage capabilities to a single use of that infrastructure. Along with the infrastructure improvements since the '90s came millions upon millions of additional users and devices, as well as an amount of data that is awe-inspiring and still growing. What's that commercial say? "There is more (personal) data on our cell phones than in our homes." A whole bunch of those infrastructure improvements are being used by data other than email attachments.

message edited by DerbyDad03



#2
December 4, 2019 at 02:24:04
Oh yes, certainly, there is much, much heavier traffic now. And yes, there is indeed some overhead involved, possibly even up to 50%.

But look at it technically: the movie you mention being downloaded and spread via mail - that is not 10 megabytes ... no, wait, that is at least 500 to 800 megabytes. That was in the days of DivX, and most were bigger. So I'm saying 10 MB is a bit too low, and you mention attachments of what, 0.5 GB, 1 GB, 2 GB? Surely that is a problem, I'm 100% sure. But the thing is: if it doesn't work this way, people will try and find other ways.

And that is my point: it doesn't work by limiting on a per-mail basis. The first thing one would do is split up the file so that each part is smaller than the limit, thus creating even more overhead than initially. What I don't understand is why a well-working service needs to be crippled by such a stupid rule. If size is that important, why don't email programs compress by default? Some files would be 1% of their initial size, if only ...
OK, some, like the movie you mentioned, will not.

message edited by Looge



#3
December 4, 2019 at 07:42:10
> What I don't understand is why a good working service needs to be crippled by such a stupid rule.

I think what you're missing is cost.

Not every system has the same resources, i.e. money, to go out and get bigger, better servers.
They work with what they can afford.

MIKE

http://www.skeptic.com/




#4
December 6, 2019 at 08:49:21
✔ Best Answer
> The first thing one would do, is split up the files, so that each split is lower than the limit. Thus, creating even more overhead than initially.

More overhead, yes, but fewer bottlenecks, since each smaller email would need to wait its turn.

However, I'm not so sure that your "first thing" assumption is true. Are you assuming that everyone is doing that now? Are you assuming that the split/reassemble process will even work for all recipients?

When your emails were rejected, how did you handle it? Did you "first thing" split the file and send multiple emails and ask your recipients to put the reports back together? Do you even know if they all know how to do that or if the system they use is even capable of doing that? If you split a report into multiple emails and I opened them on my smart phone, iPad or even a Chromebook, I wouldn't know how to reassemble the report. Maybe on my desktop I could get it done, but not everyone uses a desktop or even other types of full-blown computers anymore.

> If size is that important, why don't email programs compress by default ? Some files would be 1% of their initial size

Once again, it appears that you consider the internet, or at least "email", as a single entity. It's not. I have several email addresses with at least 3 different email service providers, from Gmail to Microsoft to a tiny ISP headquartered in an upstate NY town with a population of 32K. I access my email on at least 5 different devices, each using a different access method. Some use a dedicated email client, others are web based. I'm thankful that enough basic email standards are in place that I can access all of my email addresses on all devices - most of the time. Even with the standards in place, things get funky sometimes and I have to reset things.

Now, imagine if we tossed compression into the mix. Trying to come up with a compression standard that will work in every environment, on all platforms, for all users, might be quite a task. And if that standard is ever worked out, rolling it out to every email provider (at what cost to the provider?) would be another daunting task.
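There is also a purely technical catch behind the "1% of their initial size" hope: compression gains depend entirely on the content. A quick Python sketch with the standard-library zlib (the inputs are stand-ins - repetitive text for a report, random bytes for an already-compressed movie):

```python
import os
import zlib

# Highly redundant data (logs, spreadsheets, text reports) compresses
# dramatically; already-compressed data (DivX movies, JPEGs, ZIPs) is
# statistically close to random and barely shrinks at all.
text = b"the quick brown fox " * 50_000   # ~1 MB of repetitive text
random_ish = os.urandom(1_000_000)        # stand-in for movie data

text_ratio = len(zlib.compress(text)) / len(text)
movie_ratio = len(zlib.compress(random_ish)) / len(random_ish)

print(text_ratio)    # a tiny fraction - well under 1%
print(movie_ratio)   # essentially 1.0, sometimes slightly above
```

So default compression would help exactly the attachments that are already small relative to the limit, and do nothing for the huge media files that actually hit it.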

It's certainly not a bad idea. Perhaps they are working on that as we speak. Perhaps it's not a priority, since if no one provides huge-file capability, then no one is at risk of losing market share. Perhaps the reason it's not currently in place is because of how hard a task it is. Trust me, if Google or Microsoft thought they could become the single source of email services by providing huge-file capabilities - at both ends - then they would do it. Sure, they could say "Use our service. We'll let you send 1 GB files!". That might work if every recipient that the sender connects with can accept 1 GB files. Within a single company, for example. If not, then providing a service that few or no people can take advantage of is a waste of time and money.

message edited by DerbyDad03

