
Coleman Technologies Blog

We can give your organization comprehensive IT services and 24/7/365 live support for a predictable monthly fee. Stop stressing about technology, and start focusing on growing your business.

Pros and Cons of Leaning on a Wireless Network

The Wireless Connection

The Pros

There is one obvious benefit: no wires! Not having to run cable is a massive advantage, but the biggest benefit may just be the ability to connect devices to the network anywhere inside your business. By giving your team wireless access to network resources, you’ll see better collaboration, improved productivity, and better products and services. 

Additionally, with a strong wireless network, you can promote strategies that work to improve your operational effectiveness. One of those is a Bring Your Own Device (BYOD) strategy. Many of your employees bring their smartphones with them when they come to work. By enacting a BYOD strategy, your staff can use the devices they are most comfortable with to advance the goals of the company. 

The Cons

Many “wireless” technologies aren’t entirely wireless, and even the ones that are need to be charged regularly. So while expanding your wireless network will provide the ability to compute anywhere inside the network’s perimeter, setting up a more collaborative workspace still comes with some drawbacks, namely speed and security.

Wireless connections are more vulnerable than wired ones. It’s easier for unauthorized individuals to hijack a wireless signal, giving a third party looking to gain access a better chance at the critical information that is transmitted wirelessly. 

The Wired Connection

The Pros

When dealing with wired networks, IT admins have more control over what devices can connect to the network. This presents value in several ways. First, there is more control over the security protocols on those devices, making malware infections and other negative outcomes less likely. 

Wired connections also enhance an organization’s ability to keep their devices free from security threats. Controls have improved to the point where it is actually more difficult for attackers to break into a wired network.

Additionally, it may go without saying, but wired networks are generally faster than wireless networks. This speed advantage is magnified when walls, floors, ceilings, or other sources of interference would keep you from seeing optimal speeds over Wi-Fi.

The Cons

The biggest setback to a wired network is the act of wiring it. Initial setup is a pain, as you need to hide cables and find ways to run them so they don’t obstruct the thoroughfares around your business. Wiring is also a hindrance for maintenance if a cable fails, or if hardware has to be moved due to business growth or restructuring. 

Another detriment is that a wired connection doesn’t allow for the kind of mobility many businesses are looking for nowadays. With a wireless connection, meetings are faster and more to the point, and collaborative work can be fluid.

You have a business decision to make, and while it may not be the most crucial one you will make, it can have an effect on how your business functions. For help networking your business, call the professionals at Coleman Technologies today at (604) 513-9428.


Taking a Long Look at Your Company’s Bandwidth Needs

Bandwidth Defined

Bandwidth is one of those terms that you think you understand until you try to explain it to someone else. Basically, bandwidth is how fast data can be transferred through a medium. In the case of the Internet, millions of bits need to be transferred from the web to network-attached devices every second. The more bandwidth you have access to, the more data can be transferred. 

Speed vs Throughput

Network speed (that is, how fast you are able to send and receive data) is typically a combination of available bandwidth and a measure called latency. The higher a network’s latency, the slower the network is going to be, even on high-bandwidth connections. Latency can come from many parts of the network connection: slow hardware, inefficient data packing, wireless connections, and others. 

Throughput is the measure of how much data is actually transmitted through a connection. Also called the payload rate, it is the effective rate at which data gets delivered. So, while bandwidth is the presumed amount of data any connection can transfer, throughput is the amount of data that actually makes it through. The disparity between the two can come from several places, but typically the latency of the transmitting sources results in throughput being quite a bit less than the bandwidth. 
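
The distinction can be made concrete with a little arithmetic. Here is a minimal Python sketch (the file size, transfer time, and link speed are made-up numbers for illustration) that compares the throughput a transfer actually achieved against the link’s nominal bandwidth:

```python
def effective_throughput_mbps(payload_bytes: int, seconds: float) -> float:
    """Observed throughput: payload actually delivered per unit time, in Mbps."""
    return (payload_bytes * 8) / (seconds * 1_000_000)

def utilization(throughput_mbps: float, bandwidth_mbps: float) -> float:
    """Fraction of the nominal bandwidth the transfer actually achieved."""
    return throughput_mbps / bandwidth_mbps

# A 25 MB file that took 4 seconds over a nominal 100 Mbps link:
tput = effective_throughput_mbps(25_000_000, 4.0)   # 50.0 Mbps
print(f"throughput: {tput} Mbps, utilization: {utilization(tput, 100):.0%}")
```

In this made-up case the link only reached half its rated bandwidth; the missing half is lost to latency and overhead, exactly the disparity described above.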

What Do You Need Bandwidth For?

The best way to describe this is to first consider how much data your business sends and receives. How many devices are transferring data? Is it just text files? Are there graphics and videos? Do you stream media? Do you host your website? Do you use any cloud-based platforms? Do you use video conferencing or any other hosted communications platform? All of these questions (and a few not mentioned) have to be asked so that your business can operate as intended. 

First, you need to calculate how many devices will connect to your network at the same time. Next, you need to consider the services that are being used. These can include:

  • Data backup
  • Cloud services
  • Email
  • File Sharing
  • Messaging
  • Online browsing
  • Social Media
  • Streaming audio
  • Streaming video
  • Interactive webinars
  • Uploads (files, images, video)
  • Video conferencing
  • Voice over Internet Protocol
  • Wi-Fi demands

...and more

After considering all the uses, you then need to take a hard look at the bandwidth required for all of those tasks. Obviously, if you lean on your VoIP system, or you are constantly doing video webinars, you will need to factor those operational decisions into your bandwidth decision making. 

Finally, once you’ve pinpointed all the devices and tasks, the bandwidth each task takes, and how many people on your network do those tasks, you total up the traffic estimate. Can you make a realistic estimate with this information? Depending on your business’ size and network traffic, you may not be able to get a workable figure. 
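
As a sketch of that totaling step, here is a short Python example; the task names and per-task bandwidth figures are hypothetical placeholders, not measured values for any real office:

```python
# Hypothetical per-task bandwidth figures (Mbps); substitute measured values.
TASKS_MBPS = {
    "voip_call": 0.1,
    "cloud_apps": 1.0,
    "hd_video_conference": 2.5,
    "streaming_video": 3.0,
}

def peak_demand_mbps(usage: dict) -> float:
    """usage maps a task name to the number of simultaneous users at peak."""
    return sum(TASKS_MBPS[task] * users for task, users in usage.items())

demand = peak_demand_mbps({"voip_call": 10, "cloud_apps": 8, "hd_video_conference": 4})
print(f"Estimated peak demand: {demand} Mbps")  # 1.0 + 8.0 + 10.0 = 19.0 Mbps
```

The point of the exercise is the shape of the calculation, not the numbers: once each task has a per-user figure, peak demand is just a sum over simultaneous users.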

Too Much or Not Enough

Paying for too little bandwidth is a major problem, but so is paying for too much. Bandwidth, while more affordable than ever before, is still pretty expensive, and if you pay for too much bandwidth, you are wasting capital that you can never get back. 

That’s where the professionals come in. Coleman Technologies has knowledgeable technicians that can assess your bandwidth usage and work with your ISP to get you the right amount for your business’ usage. If you would like more information about bandwidth, its role in your business, or how to get the right amount for your needs, call us today at (604) 513-9428.


Tip of the Week: Fixing a Slow Internet Connection

You Don’t Have Enough Bandwidth

When you purchase an Internet package, you get certain speeds. Today, these speeds are faster than ever, but if your business has too much going on, it can wreak havoc with your Internet speeds. There is a situation that happens when too much data tries to pass through a network connection at once: it’s called bottlenecking, and it is potentially the reason your speeds are slow. Think about it: if you try to push several gigabytes through a connection that is only rated for a few megabits per second, it’s going to take some time to get all the data through. To check this, audit how many devices are at work; most of the time you’ll be surprised how much data you are sending and receiving. We can help you with this audit before you make the call to upgrade your Internet package.

Outdated Equipment

Another potential issue is that your networking equipment may simply be old and not be able to use the dual bands that are often necessary to get the most out of your wireless network. If you have enough bandwidth, but your Internet is just slow, chances are upgrading the modem, switches, or routers would be a prudent move and will likely fix any problems you have. 

Misconfigured Equipment and Environmental Factors

Once you’ve made sure that the physical components of your network are working as intended, but your Internet connection isn’t improving, you probably need to reconfigure the software on your devices or move your hardware to avoid interference. Specifically, if your wireless signal is having problems making it through obstructions, you will want to consider using the 2.4 GHz connection rather than the 5.0 GHz channel. The maximum speed you’ll see will decrease, but the 2.4 GHz connection makes its way through obstructions better. Unfortunately, however, the 2.4 GHz signal is more susceptible to electronic interference than the 5.0 GHz channel. 

If you need help with your business’ networking, don’t wait and lose more money. Contact the professionals at Coleman Technologies today at (604) 513-9428.


URL Manipulation and What to Do About It

The URL

Before we get into the manipulation of the URL, let’s define its parts. 

The first part of the URL is called the protocol, which tells the computing network which language is being used to communicate on said network. Most of the time, the URL will use the protocol “HTTP”. The HyperText Transfer Protocol makes it possible to exchange web pages. Other protocols that are used include File Transfer Protocol, News, and Mailto. 

The second part of the URL is the ID and password, which makes it possible to access secure servers on the network. This part is typically omitted because the password would be visible and transferred unencrypted over the computer network.

The third part of the URL is the server name. It allows users to access information stored on specific servers whether through a domain or the IP address associated with the server. 

The fourth part of the URL is the port number. This number is associated with a service and tells the server what type of resources are being requested. The default port is port 80, which can be left off the URL as long as the information that is being requested is associated with port 80.

Finally, the fifth, and last, part of the URL is the path. The path gives direct access to the resources found tied to the IP (or domain).
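
Python’s standard library can pull these parts out of a URL directly. This sketch uses a made-up example address to show where the protocol, ID and password, server name, port, and path live:

```python
from urllib.parse import urlparse

# A hypothetical URL containing all five parts described above.
url = "http://user:secret@www.example.com:8080/docs/index.html"
parts = urlparse(url)

print(parts.scheme)    # protocol: 'http'
print(parts.username)  # ID: 'user'
print(parts.password)  # password: 'secret' (visible -- why this part is omitted)
print(parts.hostname)  # server name: 'www.example.com'
print(parts.port)      # port number: 8080
print(parts.path)      # path: '/docs/index.html'
```

Seeing the credentials sitting in plain text in the parsed result makes it clear why the second part of the URL is normally left out.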

Manipulating the URL

By manipulating parts of the URL, a hacker can gain access to web pages on servers that they wouldn’t normally have access to. Most users will visit a website and then use the links provided by the website. This will get them where they need to go without much problem, but it also confines them to the boundaries the site’s designer intended.

When a hacker wants to test a site for vulnerabilities, they’ll start by manually modifying the URL’s parameters to try different values. If the web designer hasn’t anticipated this behavior, a hacker could potentially gain access to a typically protected part of the website. This trial-and-error method, where a hacker tests directories and file extensions at random to find important information, can be automated, allowing hackers to comb through whole websites in seconds. 

With this method they can search for directories that make it possible to control the site, for scripts that reveal information about the site, or for hidden files. 

Directory traversal attacks, also known as path traversal attacks, are also popular. This is where the hacker modifies the tree structure path in a URL to force a server to access unauthorized parts of the website. On vulnerable servers, hackers will be able to move through directories with ease.
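
A common server-side defense against this kind of path manipulation is to resolve every requested path and confirm it still falls under the document root. Here is a minimal Python sketch of that check; the document root is a hypothetical path, not a real server configuration:

```python
from pathlib import Path

WEB_ROOT = Path("/var/www/site").resolve()  # hypothetical document root

def is_safe(requested: str) -> bool:
    """Reject any requested path that escapes the document root once resolved."""
    target = (WEB_ROOT / requested.lstrip("/")).resolve()
    # Safe only if the resolved target is the root itself or sits beneath it.
    return target == WEB_ROOT or WEB_ROOT in target.parents

print(is_safe("images/logo.png"))    # True  -- stays inside the root
print(is_safe("../../etc/passwd"))   # False -- classic traversal attempt
```

The key design choice is to compare the *resolved* path, after all `..` segments are collapsed, rather than scanning the raw string for suspicious characters, which attackers routinely evade with encoding tricks.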

What Can You Do?

Securing your server against URL attacks is important. You need to ensure that all of your software is updated with the latest patches and threat definitions, and a carefully maintained server configuration will keep users in their lanes, even those who know all the tricks. 

The IT experts at Coleman Technologies can help you keep your business’ IT infrastructure from working against you. Call us today at (604) 513-9428 for more information about how to maintain your organization’s network security.


Looking Back at ARPANET

DoD Advanced Research

During the Cold War there was a constant need for coded systems to transmit data quickly. In the mid-1960s, the U.S. Department of Defense created what they called the Advanced Research Projects Agency (ARPA), which worked on integrating new technologies that would help the United States achieve its foreign policy goals. One of the scientists commissioned was J.C.R. Licklider, who had the idea of connecting computers at important research centers. It was a way for engineers and intellectuals to collaborate on DoD-funded projects. The network, called ARPANET, was launched in 1969.

At first, growth was slow. Small packets were sent over telephone lines, but along the way there were many innovations that set the tone for the shared computing constructs that we regularly use today. One such innovation was packet-switching. Packet-switching allows a computer to connect to several other computers at once by sending individual packets of information. In this way, computers were able to constantly send and receive information. With this method each computer on ARPANET would have (what amounts to) an address book that is constantly updated. 

As the network grew, however, this packet-switching model, beneficial as it was, became too slow to facilitate an accurate account of addresses on the system. So in 1973, the engineers at ARPA decided that Stanford University (a founding member) would keep a master address book that was kept up to date by network administrators. This decongested the network significantly.

By 1977, ARPANET had over 100 computers connected to it; and, with the age of personal computing starting to rear its head, changes started to come fast. It was about this time that other computing networks began to pop up. At first there was no interoperability between them, but this problem was remedied early in the 1980s with the standardization of the Transmission Control Protocol/Internet Protocol (TCP/IP). This was the first time the word Internet was used. 

ARPA engineers realized pretty quickly that the connecting networks that were now using the same protocol set (TCP/IP) were too numerous and were going to be unmanageable. This is when the modern Domain Name System (DNS) was introduced. They separated all addresses by domains. The first level, or top-level, domains would dictate the type of organization that a packet was being sent to. Examples include .com and .edu. Today, there are over 1,000 top-level domains out there. 

A second-level domain provided the host where data packets would be delivered. Examples that you see today are amazon.com or cornell.edu. This system provided specific data packet routing, setting the stage for the modern-day Internet. 
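
The hierarchy described above can be read straight out of a hostname by splitting it on dots, with the most general label last. A small illustrative Python sketch (the hostname is just an example from the text):

```python
def domain_levels(hostname: str) -> dict:
    """Split a hostname into its DNS labels, from most to least specific."""
    labels = hostname.lower().rstrip(".").split(".")
    return {
        "top_level": labels[-1],     # organization type, e.g. 'edu'
        "second_level": labels[-2],  # the host organization, e.g. 'cornell'
        "host": labels[:-2],         # anything more specific, e.g. ['www']
    }

print(domain_levels("www.cornell.edu"))
# {'top_level': 'edu', 'second_level': 'cornell', 'host': ['www']}
```

Reading right to left, each label narrows the destination, which is exactly how DNS delegates the routing of data packets.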

The Internet

By the late 1980s the DoD decided that ARPANET had served its purpose, and the network was decommissioned in 1990, with its role passing to the National Science Foundation’s NSFNET. In 1992, the modern Internet Service Provider (ISP) was born as the U.S. Congress passed a law allowing commercial traffic on the newly formed Internet. 

The United Nations has since proclaimed that Internet access is a fundamental human right. This marvel of human ingenuity would not have been possible without ARPA and ARPANET. If you would like to see more articles about technology’s history, subscribe to the Coleman Technologies blog today.


What Does Internet Rights Advocacy Mean?

Initially, the advocacy of Internet rights was just that: the right to have access to the Internet. While this isn’t a problem for as many people as it once was, some places still don’t have fair, affordable access to high-speed Internet service. Some nations, despite providing access, have Internet laws that suppress use through overarching censorship. This issue, and the monetization of collected consumer data, are two of the hot-button issues today for Internet rights advocates.

Lead Up

The Internet is a relatively new technology, especially in the manner it is being used by people today. As a result, there are different views on how these technologies are disseminated, who profits from them, and how non-controlling entities have their rights repressed. From the early days of Internet rights advocacy, the largest voices came from organizations that found the equitable portion of the Internet either unnecessary or repressive to the rights of consumers.

Notice that access to the Internet was not even on the roadmap. The nature of the early commercial Internet was such that it could fairly be described as libertarian. Through the end of the 1990s, as the first round of dot-com investments started to tank, it became obvious that the technology would end up bigger than anyone had anticipated, and that it needed regulation.

In the U.S., many fights have been undertaken in the subsequent 20 years, many of them pushed by Internet rights advocates. One of the most famous is:

Reno v. American Civil Liberties Union (1997)

In an attempt to clean up what some people considered indecent content on the Internet (pornography and the like), and more accurately to keep kids away from this content, Congress passed the Communications Decency Act. The ACLU, the well-known civil rights advocacy group, filed suit. The provision was struck down by two federal judges before being heard in front of the Supreme Court, which upheld the lower courts’ rulings. This was a major blow against censorship, paving the way for free expression on the Internet.

While the ACLU isn’t exactly an Internet rights advocate, the landmark case ushered in a new world of free speech on the Internet, and it set the tone for Internet rights advocates to this day.

Personal Privacy

Today there are many organizations looking to protect people on the Internet. Sometimes their views overlap, sometimes they don’t. One of these groups, the Electronic Frontier Foundation (EFF), is a major player in the fight to keep speech (and content) free from censorship on the Internet, the fight against the surveillance state, and most notably, the ongoing fight for individual privacy.

Businesses of all kinds, as well as government agencies, have grown to take significant liberties with people’s personal information. Organizations like the ACLU and the EFF work tirelessly to get the topic of personal data privacy in front of decision makers.

Have you ever wondered how you can have a conversation with a friend via some app about fingerless gloves, and suddenly the sidebar on every website you visit is filled with fingerless glove ads? Most users don’t fully understand that the organizations they interact with online keep a profile on them. All of their actions, any personal or financial information they share, and more is stored in a file that is often packaged and sold by those organizations to advertising firms.

These advocates, among the other issues they stand up for, are trying to push the issue of personal data privacy. The main point of contention is that companies profit off of the information people provide, and since this information is very clearly personal in nature, it is their belief that individuals are being taken advantage of. This debate has been ratcheted up significantly with the European Union’s General Data Protection Regulation (GDPR) that intends to protect individual information.

While it might be a matter of time before the U.S. gets a data privacy law in the same vein as the GDPR, Internet rights advocates will continue to act in the public’s favor on this issue, and many others.

Net Neutrality & Access to All

One of the biggest fights that Internet rights advocates are undertaking is against the companies that deliver the Internet itself: the Internet service providers (ISPs). For those who don’t know, over the past several years the U.S. government created mandates that forced ISPs to provide access to applications and content without favoring any, even the ones that use the most bandwidth.

The theory is that the typical Internet user only does so much on the web. They typically access the same sites and use their Internet connection for the same things. This creates a situation where ISPs, making market adjustments, would want to charge more per byte than they could if users spread the same traffic over a wider variety of sites. Under federal control, they were forced into charging a flat rate.

The net neutrality laws instituted in 2015 were repealed in 2017, as controlling bureaucrats argued that too many people were without fair access to the Internet, and that the only way to persuade ISPs to invest in infrastructure that would curb this problem was to repeal the net neutrality laws. Needless to say, this caused quite a stir.

Internet rights advocates were quick to point out that investment in infrastructure is in these ISPs’ best interest, and that giving them the ability to slow down Internet speeds as they see fit is not good for consumers. Unfortunately for most Americans, these ISPs are the companies you have to get your Internet service from if you want speeds that allow you to use it the way you want. Advocates are still trying to educate people about the benefits of net neutrality, and have set up websites where people can find information and lend their support. Organizations like the aforementioned ACLU and EFF, the American Library Association, Fight for the Future, Demand Progress, and Free Press Action currently sponsor www.battleforthenet.com, a one-stop site for all things net neutrality.

Advocacy can go a long way toward giving a voice to people who may not think they have one. What Internet-related topics do you find to be problematic? Leave your thoughts in the comments and subscribe to our blog.


Colleges Have a Lot of Data to Protect

Birth of the Internet

The Internet was born on college campuses. It was built by intellectuals, for academics, without the massive list of considerations that now accompany software development. It spread quickly, of course, and somewhere pretty early on it was decided that, by being able to support commerce, the Internet could become one of the West’s greatest inventions.

This came to fruition in 1984 when the first catalogue was launched on the Internet. This was followed by the first e-store (at books.com) in 1992, and the first software to be sold online (Ipswitch IMail Server) in 1994. Amazon and eBay launched the following year and the Internet has never been the same.

By then, the academic uses for the Internet had multiplied, as well. By the time Amazon launched, many colleges and universities were offering students access to the Internet as an important part of their continuing education. Boy, was it ever.

Today, you’ll be hard-pressed to find a school (outside of the poorest districts in the country) where every classroom isn’t Internet-ready.

College Internet Needs and Cybersecurity

This stands true in university and college circles, as well. Campuses today are almost completely connected. You’ll be hard-pressed to find a place on a modern campus where, as long as you have the security credentials, you can’t gain access to an Internet connection. In a lot of ways, it is this demand for access that makes network security a major pain point for the modern college. Firstly, having to protect computing networks from a continuously variable number of mobile devices is difficult. Secondly, the same attacks that plague businesses are also hindering IT administrators’ efforts at colleges.

Colleges themselves aren’t doing anyone any favors. According to a 2018 report, none of the top 10 computer science degrees in the United States require a cybersecurity course to graduate. Of the top 50 computer science programs listed by Business Insider, only three require some type of cybersecurity course. Moreover, only one school out of the 122 reviewed by Business Insider, the University of Alabama, requires the completion of three or more cybersecurity courses. Regardless of the metric, it’s clear that cybersecurity is not a curriculum priority at most schools.

Are There Cybersecurity Problems Specific to Colleges?

The short answer is no. That’s why it’s so important to get people thinking about cybersecurity any way they can. No industry can afford to let the skills gap between the people who hack and the people looking to stop them grow any wider. This is why, no matter what you do (or plan on doing) for a living, it’s important to understand what your responsibilities are and how to carry them out in a way that helps your organization ward off these threats from outside (and sometimes inside) your network.

Many colleges have turned to companies like Cyber Degrees to help educate the people using the college’s networks about why cybersecurity awareness is important, and to help people understand that, with the rise of cybercrime and hacking-induced malware, cybersecurity has become a major growth industry with many facets. In 2015, the Bureau of Labor Statistics found there were more than 200,000 unfilled cybersecurity jobs in the U.S. With curricula not prioritizing cybersecurity, and with threats growing rapidly, imagine how many are unfilled today. As demand rises for competent individuals to fill a multitude of jobs in the computer security industry, colleges need to do a better job prioritizing cybersecurity training.

For the business looking into protecting itself, look no further than the cybersecurity professionals at Coleman Technologies. Our knowledgeable technicians work with today’s business technology day-in and day-out and know all the industry’s best practices on how to keep you and your staff working productively, while limiting your exposure to risk. Call us today at (604) 513-9428 to learn more.


Tip of the Week: Bandwidth Questions

What is Bandwidth?
In its most basic form, bandwidth is how quickly you can download content from the Internet. Bandwidth is measured in megabits per second, or Mbps. The more bandwidth you have, the faster downloads will run. Some high-speed connections are measured in gigabits per second (Gbps).

How Exactly Does Bandwidth Translate to Download Speed?
If you’re trying to calculate your projected download speed, keep in mind that there are eight bits in every byte. This means that if you’re trying to download eight megabytes of data on a one Mbps connection, it will take about one minute. A 512 megabyte file, on the other hand, would take just over an hour to download on the same connection.
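
That back-of-the-envelope conversion looks like this in Python, remembering the eight bits per byte:

```python
def download_seconds(megabytes: float, mbps: float) -> float:
    """Time to move a file of `megabytes` over a `mbps` link (8 bits per byte)."""
    return (megabytes * 8) / mbps

print(download_seconds(8, 1))    # 64.0 seconds: about a minute
print(download_seconds(512, 1))  # 4096.0 seconds: just over an hour
```

Real transfers will run a little slower than this ideal figure, since latency and protocol overhead eat into the nominal bandwidth.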

Estimating Your Business’ Needs
In order to reach an appropriate estimate of your business’ bandwidth needs, you’ll need to use a little math. Take the estimated traffic that you expect each of your processes to take up, as well as the number of users engaged in each process. You’ll want to assume peak operations; otherwise, you might not have enough bandwidth during an important operational period. You can generally rely on the following speeds for bandwidth estimation:

  • 100Kbps and below: Low-end, single-line VoIP phones and e-fax machines. Some more basic computers have processes that use less than 100Kbps, but in the business world, you probably aren’t using them.
  • 100-500Kbps: More computers and laptops fall into this range, as they are more likely to be the ones streaming, downloading, emailing, and browsing than other less intensive devices.
  • 500Kbps-2.0Mbps: Cloud solutions and standard definition video conferencing take up about this much bandwidth. This is the general range for Enterprise Resource Planning solutions, Customer Relationship Management platforms, and Point of Sale devices.
  • 2.0Mbps and more: High-definition conferencing solutions, remote access, heavy cloud access, and other resource-intensive tasks fall under this category.

Keeping those peak activities at the top of your mind, add up what your staff will need to stay on task and ahead of schedule. For example, let’s say you have ten users, including yourself. You might be using 450Kbps for correspondence, while six of your employees are using a CRM solution at 2.0Mbps each. The last three are using high-definition video conferencing software at 2.5Mbps each. Add all this up and you can expect to use about 20Mbps at heaviest use, but you want to go a little beyond this to 25Mbps, just to be safe.

What are other tips that you might want us to share? Leave us a comment and let us know.


ISPs Have Finally Started Rolling Out 5G

To begin, we have to say that anyone talking about 5G as a selling point for any product or service in early 2019 is selling you a marketing strategy. As the year goes on, however, we will be getting the first 5G networks and the devices that can run on them. So, while it is true that some 5G wireless networks will come online in 2019, the lion’s share of networks will be using the same wireless platform you’ve had for the past decade.

Fourth Generation

In December 2009, 4G wireless broadband networks went online for the first time, with the U.S. finally getting 4G LTE in June of 2010. Much like 5G will be for us, 4G was a revelation for the mobile consumers of the time. 3G networks were delivering 200 kilobits to five megabits per second, so the boost to 4G’s 100 megabits-to-a-gigabit per second was a huge jump. It made cloud computing and streaming media possible, and opened up a massive market for mobile applications and for devices that could handle those applications. The effect 4G had on society was massive.

Fifth Generation

Like the 4G networks before it, 5G networks will improve bandwidth speed, reduce latency, and provide a whole new layer of application support. It will effectively bring office Internet speeds to mobile devices. With speeds up to 100 gigabits per second, a 5G connection could be nearly 1,000 times faster than current 4G speeds. This will make streaming communications seamless and give application developers a whole new construct to work in, improving mobile computing with each new innovation. To put this into perspective, with a solid 5G connection you could, in theory, download a full movie in a few seconds.

This gives people better network stability, ensuring that business-critical mobile functions are reliable, along with the speeds necessary to give users the digital tools they need to be productive anywhere they are. The problem many organizations (and individuals) will have is that, when your area gets 5G (Verizon has announced it is launching 5G the second week of April in Minneapolis and Chicago), you will have to get a phone that is capable of working with the new 5G networks. Thus far, here is the list:

  • Samsung Galaxy s10 5G
  • LG v50 ThinQ
  • Huawei Mate X
  • ZTE Axon 10 Pro 5G
  • Unnamed OnePlus 5G Smartphone

That’s it. No Apple version. No European version. In fact, of these phones, the Galaxy s10 5G is the only one that will be available in the United States at the time of Verizon’s 5G launch in April.

Verizon is also making a 5G mod available for the Moto z3 for $50; the retail price comes in at a cool $349.99 if it isn’t purchased during the promotional period.

Users should also know that if they are lucky enough to get access to a 5G network, they will have to pay extra for it. The price on Verizon is currently an additional $120 per year; other telecoms haven’t announced true 5G pricing strategies, even though most of them have rolled out products claiming 5G capabilities on handsets that only offer 4G LTE speeds. As we stated above, these are marketing ploys.

Do you plan on using 5G when it’s rolled out? Leave your thoughts in the comments section below. If you would like to know more about the technology behind 5G, subscribe to our blog today.

Updating the Whole Net Neutrality Situation

Commercially available Internet services have existed since the early 1990s, and as broadband was implemented, investment in the medium was strong. In an attempt to keep control of the Internet distributed among the people who use the service, rather than the massive corporations looking to gain control over it, the Federal Communications Commission (FCC) adopted neutrality principles in 2005 “to preserve and promote the vibrant and open character of the Internet as the telecommunications marketplace enters the broadband age.”

For seven years, lawmakers attempted to pass bills in Congress that would secure an open future for the Internet. All of these attempts failed, leaving the question of who would control the Internet up in the air. The fear was that ISPs, which are typically huge multinational conglomerates, would be able to ration bandwidth by price, as they do with their television services. Internet freedom advocates argued that putting “local monopolies enshrined in law,” and the price discrimination they bring, at the helm of what has proven to be the most remarkable invention in human history would be counterproductive to an open and useful Internet.

Years of litigation followed. In cases such as Verizon Communications Inc. v. FCC, courts ruled that the FCC had no regulatory power over Internet service because it was, in fact, not classified as a utility and was thus governed under Title I of the Communications Act of 1934. Immediately after this ruling, the FCC took steps to reclassify Internet delivery services as a public utility, governed under Title II of the Act. In February of 2015, voting members agreed that Internet service met the criteria of a utility under Title II of the Communications Act of 1934 and the more recent Telecommunications Act of 1996. In April of 2015, “net neutrality” was upheld by officially declaring Internet service a utility, and the rules went into effect the following June.

The “final rule” turned out to be short-lived, however. In April of 2017, the FCC proposed to repeal the policies that governed net neutrality and return control to the corporations that invest in and provide broadband services. The proposed changes were met with heavy consternation, with over 20 million people providing comments during the public comment phase of the process. It was later found that millions of the comments made in support of the repeal were submitted fraudulently by foreign actors. Despite the overwhelming dissent of the masses, the FCC repealed the net neutrality policies and followed the decision with a hefty amount of propaganda material claiming that it was “restoring Internet freedom.” The repeal became official in June of 2018.

What Is Going on with Net Neutrality Now?
Almost immediately after the change was made, lawsuits were filed, and they keep coming. States, advocacy groups, net neutrality lobbies, and companies have all brought suits against the FCC, both for its handling of the situation and for the repeal of net neutrality itself.

One way to ascertain whether the repeal has been a benefit is to look at the claims the FCC made before dismantling the mandate:

  1. Net neutrality is hindering broadband investment. In 2018, what is known as the Big Four--Verizon, AT&T, Charter, and Comcast--collectively spent less on broadband projects than they did in 2017. It was the first time in three years that investment had dropped.
  2. It doesn’t make sense for ISPs to throttle Internet traffic. The Big Four reportedly slowed Internet traffic without telling customers no more than six weeks after the repeal. Sites like YouTube, Netflix, and Amazon Prime were the most targeted. Verizon was especially culpable, as it was found to have slowed data speeds in a way that led to slower EMS response times, a major problem as firefighters were battling massive fires in California.

The issue isn’t totally devoid of common ground, however. Almost everyone believes that ISPs shouldn’t be able to flex their muscles, so to speak, and there is a push to restore older FCC mandates that prohibited ISPs from enacting anticompetitive and harmful practices. Basically, everyone wants a fast, open, and unobstructed Internet; the disagreement, usually along party lines, is over who is responsible for the regulation.

An overwhelming majority of people support net neutrality. Most want to return oversight of the Internet to regulators, believing that corporations whose stated purpose is to make a profit aren’t the best organizations to manage something as important as Internet access, despite being the companies that sell that access. Time will tell who is right.

If you would like to do something about it, go to https://www.battleforthenet.com/ and sign up. Do you believe market forces will keep ISPs honest, and the Internet open? Leave your thoughts in the comments section below.

The Good, Bad, and Ugly of the Internet

The Good
Let’s start with the resoundingly positive attributes of the Internet. First, it makes life extraordinarily easier. Banking, shopping, and direct communication with other individuals and businesses are all simpler and faster, so people can get more done in less time. It makes people smarter by giving them access to a knowledge base unprecedented in human history. It provides the opportunity to connect with like-minded people from anywhere in the world at minimal cost, giving people the ability to do wonderful things for others they may have never met. It provides businesses and individuals alike access to better opportunities, more knowledge, and interactions with the people who matter to them.

Speaking of business, the Internet has changed things for entrepreneurs dramatically. Data storage and retrieval are faster. Cloud platforms of all types offer software, hardware, security, and development platforms that reduce the enormous capital costs many organizations were spending on their IT. It gives organizations access to a glut of resources, none more important than a growing mobile workforce that is available around the clock, promoting better productivity. It provides the opportunity to streamline all types of work, whether by reducing face-to-face interactions with your vendors or by utilizing tracking software that helps administrators build more efficient business practices.

The Internet has provided a social outlet to people who didn’t have one. The use of social media has revolutionized the way people share and communicate. Each person has the freedom to do whatever they choose online, and often this results in positive action. Many important groups that have been marginalized for one reason or another are now able to promote their platforms thoroughly.

The Bad
There are some things about the Internet that many people could take or leave. In fact, for every benefit listed above, there is a drawback. Easier access to information opens the door to more misinformation. For all the ease of banking, shopping, and communication, there are threat actors looking to steal resources and personal information for profit. For every like-minded person you meet, you also meet all manner of Internet trolls and other unsavory characters.

Social media has had an amazing amount of influence, and for all the good it does in promoting individual freedom from convention, it also creates what is known as a “toxic mirror” effect: people form negative opinions of themselves through constant exposure to images of seeming perfection. The toxic mirror casts anything short of physical, emotional, and mental perfection as ugly and bad.

Beyond the toxic mirror, many people use social media in ways that hurt those around them. The manifestation of a social persona often presents the opportunity for a user to spread very public misinformation. This break from reality further muddies people’s ability to properly identify risk, putting them in harmful situations. The Internet is filled with trolls, stalkers, and bullies, and these groups run rampant because people don’t have many resources to ward against them. These individuals hide behind their Internet personas, making civil action against them extremely difficult. Cyberbullying, specifically, can cause great harm to people of all ages.

For the business, the Internet is a true double-edged sword. On one hand, if you don’t utilize its features, you could be hindering the way you conduct business, since more people are exposed to your business on the Internet than anywhere else. On the other hand, you have to spend a lot of advertising capital to get your business in front of potential customers. For some businesses this spending pays off, but for the lion’s share it increases the capital required without any assurance of additional sales.

The Ugly
The Internet is actually a pretty dangerous place, and it’s a lot bigger than people think. While the usable part of the Internet is catalogued by the major search engines, there is a massive part that is not, and some of it is filled to the brim with risky behavior. The deep web, and more specifically the dark web, is filled with problematic content. Users can’t simply stumble into this part of the Internet; the people who access it deliberately are often the hackers and dissidents of the world. Some are evil, some just unfortunate, but much of the dark web is a black market offering goods and services the average person has no use for. Murder for hire? Check. Drug catalogues? Check. Hacking resources? Check. It’s essentially an antisocial person’s playground filled with hate and illegal material. Think of the dark web as a city: some places in that city (like in many other cities) are very dangerous, and while you may find something there that you can’t find anywhere else, staying far away is a good way to avoid the negatives altogether.

For the business, the ugliest part of the Internet is the countless hacking collectives and individual hackers almost constantly trying to gain access to its network. Computer viruses and other malware, including ransomware, are such a big threat that businesses spend billions of dollars a year trying to protect themselves and their clients from people looking to steal their data and sell it off.

The Internet does a lot of good for a lot of people, but the more derision, hate, criminal behavior, and strategic subversion happen on it, the more it becomes something it was never intended to be. The saving grace is the hundreds of millions of users who still use the Internet to make their lives, and the lives of the people around them, better.

The IT professionals at Coleman Technologies are serious about making others’ lives better. If your business wants to utilize the good and secure itself against the bad and the ugly, contact us today at (604) 513-9428. We can help your organization protect your data and scan the dark web to see if any of your accounts (or your employees’ accounts) were already stolen and leaked on the dark web.

An Introduction to the Dark Web

In the west, the Dark Web is mostly known as a sinister network used to traffic in all kinds of illegal contraband, but in other parts of the world it is often looked upon as the last bastion of privacy in what can be horribly repressive political regimes. Overall, the Dark Web in practice is a construct that supports user anonymity.

Keep in mind that we are certainly not endorsing use of the Dark Web. We just want you to be aware that the Internet you routinely access, or what we will call the surface web in this blog, is in fact a very small piece of the enormity of the web itself.

A Complete Look at the World Wide Web
If the Internet that we can all access makes up only a very small percentage of the entire Internet, what is hosted on the rest of it? Most of it, known as the “Deep Web,” is filled with legitimate data, mostly in the form of unindexed content. Encrypted data, such as online banking, pay-to-play video services, and other forms of everyday Internet use, makes up a large portion of the Deep Web. With the revelation that there was an online black market where people could get almost anything, many people started confusing the Deep Web with the Dark Web, or darknet. This misconception leaves many people confused about the purpose of the seemingly bottomless Internet, but with most of it taken up by cloud environments and other encrypted services, the notion that the Deep Web is somehow nefarious is misplaced.

What is the Dark Web?
The Dark Web, on the other hand, is hosted within the Deep Web, beyond the sight of the average Internet user. While the surface web is unencrypted and can be accessed by just about anyone, the Dark Web can be reached only through specialized, encrypted browsers. You may have heard of ransomware programs asking victims to download the Tor web browser to make payments. This is because Tor is one of the browsers able to reach the Dark Web, although it should be mentioned that it isn’t used exclusively for paying ransomware demands.

Tor is what is known as an onion router. To maintain a user’s anonymity, an onion router passes user traffic through several intermediary relays, each of which strips away one layer of encryption, hiding the user from being tracked. It’s like passing each request through the layers of an onion, hence the moniker.
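The wrap-and-peel structure can be sketched in a few lines of Python. This toy uses XOR with per-hop keys purely to illustrate the layering; it is nothing like Tor’s real cryptography, and every name here is invented for the example:

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy "cipher": XOR with a repeating key (illustration only, NOT secure)
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def wrap(message: bytes, hop_keys: list[bytes]) -> bytes:
    # The sender encrypts for the exit relay first, then outward for each
    # earlier relay, so the entry relay sees only the outermost layer.
    for key in reversed(hop_keys):
        message = xor_bytes(message, key)
    return message

def route(onion: bytes, hop_keys: list[bytes]) -> bytes:
    # Each relay peels exactly one layer; no single relay sees both the
    # sender's identity and the innermost plaintext.
    for key in hop_keys:
        onion = xor_bytes(onion, key)
    return onion

keys = [os.urandom(16) for _ in range(3)]  # one secret per relay
onion = wrap(b"GET example.onion", keys)
peeled = route(onion, keys)  # the exit relay recovers the original request
```

The design point is that anonymity comes from the path, not any one relay: each hop only ever learns its immediate neighbors.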

What Else is On the Dark Web?
The services offered on the Dark Web are varied, but they generally have one thing in common--most of them are illegal. If you can think of it, and it’s not on the normal World Wide Web, chances are there is a place on the Dark Web for it. Some of the services provided on the Dark Web include, but are not limited to, the following:

  • Illegal pornography
  • Bitcoin services (not outright illegal, but often used for money laundering purposes)
  • Botnets that can be bought or rented for nefarious use
  • Markets for drugs, weapons, and other illegal contraband
  • Scams and other phishing threats are rampant on the Dark Web, so even those who are looking to take advantage of these services have to be careful

Most notable for businesses is that hacking services can be acquired even by inexperienced users, meaning that anyone with an agenda has access to services that could cripple your business. It’s more important today than ever to make sure your organization is taking the necessary measures to protect itself from these threats.

With so much information hidden from view, there is a significant chance that information that could become problematic for your business is already out there. At Coleman Technologies, we can scan to ensure that your passwords and other personal information aren’t readily available. Call us today to learn more at (604) 513-9428.
