Issam Najm and R. Rhodes Trussell
The development and implementation of water treatment technologies have been driven mostly by three primary factors: the discovery of new and rarer contaminants, the promulgation of new water quality standards, and cost. For the first 75 years of this century, chemical clarification, granular media filtration, and chlorination were virtually the only treatment processes used in municipal water treatment. However, the past 20 years have seen a dramatic change in the water industry's approach to water treatment, in which water utilities have begun to seriously consider alternatives to the traditional filtration/chlorination treatment approach. This paper identifies and discusses some of these "emerging" technologies.
Technology Development and Implementation Process
For a new technology to be considered, it must offer advantages over traditional treatment processes. These can include lower capital and operations and maintenance costs, higher efficiency, easier operation, better effluent water quality, and lower waste production. Nevertheless, for a water treatment technology to be accepted and implemented at large municipal scale, it must be demonstrated in stages. Understanding this process is necessary in order to properly plan the introduction of a new technology to municipal water treatment. A typical sequence of these stages might be summarized as follows:
Stage 1: Successful demonstration in another field.
Stage 2: Testing and development at bench- and pilot-scale levels (1 to 50 gpm).
Stage 3: Verification at demonstration-scale level (>100 gpm).
Stage 4: Multiple successful installations and operations at small full-scale level (0.5 to 5 MGD).
Stage 5: Implementation at a large-scale municipal water treatment plant.
Two important milestones must be achieved in parallel with the above stages: obtaining regulatory approval and reducing costs to competitive levels. Commonly, regulatory approval is necessary by the end of the demonstration-scale verification stage (stage 3) and prior to installation at small full-scale plants (stage 4). However, for a new technology to reach full acceptance (stage 5), its cost must be competitive with that of other more conventional processes that achieve the same objective.
The duration of each of the above stages can vary greatly depending on the technology being considered, the urgency of implementing it, how long it takes for its cost to reach competitive levels, and the significance of its role in the overall water treatment train. The last factor is different from the others in that it recognizes the difference between a technology proposed as an alternative to an essential component of water treatment (filtration, for example) and one proposed to replace a less critical component such as a pump, automation, chemical feed, taste-and-odor control, or preoxidation.
Technologies Evaluated
A wide range of water treatment technologies have been developed or are currently in development. This paper focuses on technologies that can be applied in municipal water treatment plants. Such a technology should meet the following criteria:
- The technology can be scaled to large applications (i.e., > 5 MGD).
- The technology can be cost competitive with existing technologies at large scale.
- The technology can produce water that meets regulatory requirements.
- The technology has a high degree of reliability.
In this paper the following technologies are screened and evaluated: membrane filtration (low pressure and high pressure), ultraviolet irradiation, advanced oxidation, ion-exchange, and biological filtration. Many of these technologies are certainly not new to the water industry. However, either their application has been limited or they were introduced to the water industry so recently that many questions remain unanswered about their large-scale application.
Membrane Filtration Technology
Two classes of membrane treatment systems should be discussed: low-pressure membrane systems and high-pressure membrane systems. Low-pressure membranes, including microfiltration (MF) and ultrafiltration (UF), are operated at pressures ranging from 10 to 30 psi, whereas high-pressure membranes, including nanofiltration (NF) and reverse osmosis
(RO), are operated at pressures ranging from 75 to 250 psi. Figure 11-1 shows a schematic of the pore size of each membrane system as compared to the size of common water contaminants.
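The size comparison in Figure 11-1 can be summarized in a short sketch. The pore values below are the nominal figures cited in this paper (0.2 µm for MF, 0.01 µm for UF, roughly 1 to 2 nm for NF, and less than 1 nm for RO); the size-only screening rule is a simplification, since real rejection also depends on membrane chemistry and operating conditions.

    # Nominal pore sizes in nanometers, from the values cited in this paper.
    # Illustrative only; sieving by size is not the only rejection mechanism.
    NOMINAL_PORE_NM = {"MF": 200.0, "UF": 10.0, "NF": 2.0, "RO": 1.0}

    def suitable_membranes(contaminant_nm: float) -> list[str]:
        """Membrane classes whose nominal pore size is smaller than the
        contaminant and would therefore be expected to reject it by sieving."""
        return [m for m, pore in NOMINAL_PORE_NM.items() if pore < contaminant_nm]

    print(suitable_membranes(5000))  # Cryptosporidium oocyst (~4-6 um): all four
    print(suitable_membranes(25))    # small virus: ['UF', 'NF', 'RO']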
Low-Pressure Membranes
If there is a "Cinderella" story of a water treatment technology, it is the application of low-pressure membranes to surface water treatment. The idea of using low-pressure membrane filtration for surface water treatment began developing in the early 1980s. At the time, low-pressure membranes had long been used in the food-processing industry as a nonchemical means of disinfection. During the latter half of the 1980s, several research projects were initiated by west coast water utilities (East Bay Municipal Utilities District and Contra Costa Water District), the American Water Works Association (AWWA) Research Foundation, and other organizations to evaluate MF and UF for municipal surface water treatment. The studies clearly showed that both MF membranes (with a nominal pore size of 0.2 µm) and UF membranes (with a nominal pore size of 0.01 µm) are highly capable of removing particulate matter (turbidity) and microorganisms. In fact, the research results showed that, when it came to these contaminants, membrane-treated water was of much better quality than that produced by the best conventional filtration plants. Figure 11-2 shows an example plot of turbidity removal by an MF membrane. The majority of treated-water samples had a turbidity level near the limit of the on-line turbidimeter (less than 0.05 Nephelometric Turbidity Units (NTU)). In addition, membrane filtration (both MF and UF) was proven to be an "absolute barrier" to Giardia cysts and Cryptosporidium oocysts when the membrane fibers and fittings were intact. Finally, the particular UF membranes tested by Jacangelo et al. (1995) were also proven to act as absolute barriers to viruses because of their nominal pore size of 0.01 µm.
As a surface water treatment technology, low-pressure membrane filtration has several advantages over conventional filtration and chlorination. These include a smaller waste stream, lower chemical usage, a smaller footprint, greater pathogen reduction, no disinfection byproduct formation, and a greater degree of automation. For a while it was also believed that low-pressure membrane filtration was highly susceptible to excursions in raw water turbidity. However, pilot- and full-scale operational data have demonstrated that low-pressure membranes can treat turbidity excursions as high as several hundred NTUs with manageable impacts on process operation and efficiency (Yoo et al., 1995). All of the above advantages greatly favor membrane filtration over conventional filtration with chlorine.
On the other hand, because their pores are far larger than dissolved organic molecules, low-pressure membranes are ineffective for the removal of dissolved organic matter. Therefore, color-causing organic matter, taste-and-odor-causing compounds such as geosmin and methylisoborneol, and anthropogenic chemicals can pass through the membranes into treated water. This limits the applicability of low-pressure membrane filtration to surface water sources where the removal of organic matter is not required. One UF membrane system has overcome this
limitation by introducing powdered activated carbon (PAC) as part of the system. PAC injected into the influent water to the membrane is retained on the concentrate side of the membrane and disposed of with the waste stream. This approach is certain to expand the domain of low-pressure membrane applications in surface water treatment, especially at sites where organic removal is only occasionally required.
Despite all of these positive aspects, low-pressure membrane filtration had several obstacles to overcome. First, for several years the cost of membrane filtration systems at "municipal" scale (i.e., greater than 1 MGD) was prohibitively high. Second, membrane filtration did not have regulatory acceptance and required extensive evaluation on a case-by-case basis. Third, information on its reliability in large-scale municipal applications was not available.
However, since the early 1990s, the cost of low-pressure membranes has decreased dramatically, making them more attractive to water utilities for full-scale implementation. In addition, a number of water utilities recognized the benefits that low-pressure membrane systems provide and decided to undergo the regulatory approval process to install these systems at relatively small and cost-effective scales. This has opened the door for the installation of increasingly larger low-pressure membrane plants. Until 1994, all MF and UF plants in the United States and around the world had capacities of less than 3 MGD. In 1994, the first large-scale MF plant (5 MGD) went on-line in San Jose, California, after undergoing significant testing to obtain California Department of Health Services approval. Since then, the application of low-pressure membrane filtration has been on the rise. Figure 11-3 shows the recent profile of low-pressure membrane installations in North America as cumulative plant capacity. Today, membrane filtration is rapidly becoming accepted as a reliable water treatment technology. The California Department of Health Services has certified one MF membrane system for water treatment in the state and has granted it 3-log Giardia removal credit and 0.5-log virus removal credit. It has also certified one UF membrane system, granting it 3-log Giardia removal credit and 4-log virus removal credit. Others are either being considered for certification or are actively undergoing the required testing. Membrane system construction costs are believed to be comparable to conventional plant construction costs up to a capacity of 20 MGD. However, this upper ceiling is rapidly rising; in fact, membrane plants with capacities ranging from 30 to as high as 60 MGD are being considered in the United States.
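As a minimal illustration of the log-credit arithmetic used above (the counts below are hypothetical, not data from the cited studies):

    import math

    def log_removal(c_in: float, c_out: float) -> float:
        """Log removal value: LRV = log10(C_influent / C_effluent)."""
        return math.log10(c_in / c_out)

    # Hypothetical counts: 10,000 cysts/L in the feed and 10 cysts/L in
    # the filtrate correspond to the 3-log Giardia credit mentioned above.
    print(log_removal(10_000, 10))  # 3.0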
High-Pressure Membranes
As noted earlier, included in this category are nanofiltration (NF) and reverse osmosis (RO) membranes. NF membranes are actually thin-film composite RO membranes that were developed specifically to cover the pore-size range between RO membranes (<1 nm) and UF membranes (>2 nm) (Matsuura, 1993)--hence the name nanofiltration. Thin-film composite (TFC) membranes are discussed later in this paper. The result was a type of membrane that operates at higher flux and lower pressure than traditional cellulose acetate (CA) RO membranes. In fact, NF membranes are sometimes referred to as "loose" RO membranes and are typically used when the high sodium rejection achieved by RO membranes is not required but divalent ions (such as calcium and magnesium) are to be removed (Scott, 1995). Nevertheless, NF membranes are viewed by the water industry as a class of membranes separate from RO membranes and are discussed in this paper as such. NF membranes are commonly operated at pressures ranging from 75 to 150 psi (Lozier et al., 1997). NF membranes have been used successfully for groundwater softening since they achieve greater than 90 percent rejection of divalent ions such as calcium and magnesium. Several NF membrane-softening plants are currently in operation in the United States, with the first plant installed in Florida in 1977 (Conlon and McClellan, 1989). By 1996 the combined total capacity of NF plants in the United States was greater than 60 MGD, all in Florida (Bergman, 1996). It is estimated that approximately 150 NF membrane plants existed around the world by 1995, with a combined total capacity of approximately 160 MGD (Scott, 1995). Because most commercially available NF membranes have molecular weight cutoff values ranging from 200 to 500 daltons (Bergman, 1992; Scott, 1995), they are also capable of removing greater than 90 percent of the natural organic matter present in the water. Therefore, they are also excellent candidates for the removal of color and, more importantly, disinfection byproduct (DBP) precursor material (Taylor et al., 1987; Tan and Amy, 1989; Blau et al., 1992; Chellam et al., 1997).
Currently, NF membranes are being considered as a total organic carbon (TOC) removal technology in surface water treatment. The idea is to install NF membranes downstream of media filtration in order to maintain a very low solids-loading rate on the membranes. Although NF membranes have been designated by the U.S. Environmental Protection Agency (EPA) as one of two best available technologies (BATs) for meeting stage 2 of the Disinfectants/Disinfection Byproducts Rule, they have not been applied for surface water treatment at full scale. To date, pilot studies evaluating the applicability of NF membrane filtration downstream of media filtration in surface water treatment have produced mixed results (Reiss and Taylor, 1991; Tooker and Robinson, 1996; Chellam et al., 1997). The study reported by Chellam et al. (1997) clearly demonstrated that the fouling rate of NF membranes downstream of conventional filtration was two times higher than that of NF membranes downstream of MF or UF membranes. This was supported by the study of Reiss and Taylor (1991), which showed that conventional filtration pretreatment did not reduce the fouling rate of NF membranes to acceptable levels. Nevertheless, the Information Collection Rule includes data gathering on the applicability of NF membrane filtration for TOC removal from surface water sources. The majority of the data will be from bench-scale testing, which does not provide information on long-term operational design and reliability, but some data will be obtained from pilot-testing programs. These data will provide additional input into the viability of NF membranes for surface water treatment.
RO membranes have long been used for desalination of seawater around the world. These membranes can consistently remove about 99 percent of the total dissolved solids (TDS) present in the water, including monovalent ions such as chloride, bromide, and sodium. However, for a long time these membranes were predominantly made from CA and required operating pressures at or greater than 250 psi. Recent innovations in RO membrane manufacturing have produced a new class of RO membranes, called TFC membranes, that can achieve higher rejection of inorganic and organic contaminants than CA RO membranes while operating at substantially lower pressures (100 to 150 psi). In addition, CA RO membranes commonly require acid addition to lower the pH of the water to a range of 5.5 to 6.0 to avoid hydrolysis of the membrane material. TFC RO membranes do not hydrolyze at neutral or high pH and therefore do not require pH depression with acid addition. It should be noted that pH depression to prevent the precipitation of salts (such as CaCO3) on the membrane surface may still be necessary in some cases, depending on the quality of the water being treated and the availability of suitable antiscalants.
TFC RO membranes are currently being evaluated for water reclamation. Results from ongoing pilot studies have shown that TFC RO membranes can achieve greater than 90 to 95 percent rejection of nitrate and nitrite, compared to 50 to 70 percent removal with CA RO membranes. The same pilot studies also show that the TOC concentration in the effluent of TFC RO membranes can be as low as 25 to 50 µg/L.
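The rejection percentages quoted above follow from a one-line definition. The sketch below applies it to a hypothetical feed/permeate pair chosen to be consistent with the nitrate figures cited, not taken from the pilot studies themselves:

    def percent_rejection(c_feed: float, c_permeate: float) -> float:
        """Observed rejection, R = (1 - C_permeate / C_feed) x 100 percent."""
        return (1.0 - c_permeate / c_feed) * 100.0

    # Hypothetical nitrate example: 40 mg/L in the feed and 2 mg/L in the
    # TFC RO permeate correspond to 95 percent rejection.
    print(percent_rejection(40.0, 2.0))  # 95.0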
Because of their existing applications for water softening and seawater desalination, high-pressure membrane treatment is currently accepted by the regulatory community and the water industry as a reliable technology. The main obstacle to increased application of high-pressure membranes in municipal water treatment is their high cost. By nature of the current modular design of membrane systems, economies of scale are not realized for large treatment plants. However, several membrane manufacturers are currently modifying their membrane system designs to make them economically attractive at large scale.
Two-Stage Membrane Filtration
From the above discussion it is apparent that low-pressure membranes are highly effective for particulate removal, while high-pressure membranes are effective for the removal of dissolved matter (both organic and inorganic). Conceptually, combining the two membrane systems in series (MF or UF followed by NF or RO) would provide a comprehensive treatment process train capable of removing the vast majority of dissolved and suspended material present in water. Such a treatment train is commonly termed "two-stage membrane filtration." Other names include "integrated membrane systems" and "dual-stage membrane filtration." The only material believed to pass through such a treatment train is low-molecular-weight organic chemicals. However, compared to existing treatment, a two-stage membrane filtration process (possibly coupled with PAC addition) would produce far superior water quality. The main concern about such highly treated water is that it may be more corrosive. Special corrosion inhibition measures for low-TDS waters of this kind require further development.
Several studies have been conducted to evaluate two-stage membrane systems for surface water treatment (Wiesner et al., 1994; Chellam et al., 1997; Kruithof et al., 1997; Vickers et al., 1997). The results of these studies have clearly shown that MF or UF membranes are excellent pretreatment processes to NF or RO membranes and that the combined particulate removal and organic removal capabilities of this treatment scheme produce excellent water quality that complies with existing and forthcoming regulatory requirements.
The primary obstacle that a two-stage membrane treatment system must overcome is its cost. Lozier et al. (1997) estimated the capital cost of a 40-gpm, two-stage membrane system at $4/gpd. The capital unit cost of a large-scale, two-stage membrane system may range from $2 to $3/gpd of capacity. This is still substantially higher than the cost of conventional treatment, which is estimated at $1 to $1.5/gpd.
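The unit costs above are simple ratios of capital cost to plant capacity. A minimal sketch, with hypothetical plant costs chosen only to land inside the quoted ranges:

    def capital_unit_cost(capital_usd: float, capacity_mgd: float) -> float:
        """Capital unit cost in dollars per gallon-per-day of capacity."""
        return capital_usd / (capacity_mgd * 1_000_000)

    # Hypothetical 20-MGD plants at the unit costs quoted above:
    print(capital_unit_cost(50_000_000, 20))  # 2.5  ($2.50/gpd, two-stage membrane)
    print(capital_unit_cost(25_000_000, 20))  # 1.25 ($1.25/gpd, conventional)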
Summary
Membrane filtration technology is rapidly becoming accepted in the water treatment industry. Low-pressure membrane filtration (MF and UF) is now replacing conventional filtration for surface water treatment at several locations in the United States. High-pressure membrane filtration (both NF and RO) is used primarily for softening and TDS reduction but is being evaluated for the removal of natural organic matter in water treatment. The main obstacle to large-scale implementation of membrane filtration is its capital cost. Ongoing innovations in the design of large-scale membrane systems are continually lowering their capital cost and making them increasingly cost competitive with conventional treatment processes.
Ultraviolet Irradiation Technology
Ultraviolet (UV) irradiation technology is primarily used in the water and wastewater treatment industry as a disinfection process that capitalizes on the germicidal effect of UV light in the wavelength range of 250 to 270 nm (EPA, 1996). The process is commonly designed such that water flows in a narrow region around a series of UV lamps, and the microorganisms in the water are inactivated through exposure to the UV light. The process is compact since the time of exposure (which translates into hydraulic retention time) is commonly measured in seconds. The process works on the principle that UV energy disrupts the DNA of microorganisms and prevents them from reproducing. UV irradiation technology has been used since the 1950s at approximately 500 drinking water facilities in the United States and more than 1,500 facilities in Europe (Kruithof et al., 1992; Parrotta and Bekdash, 1998). However, the vast majority of the U.S. facilities are either transient-noncommunity or nontransient-noncommunity groundwater systems serving fewer than 3,000 people each. These are facilities that provide water to restaurants, highway rest areas, airports, schools, camps, factories, rest homes, and hospitals. In fact, UV disinfection technology in drinking water treatment is currently promoted only for small-scale groundwater systems. However, the process can certainly be scaled up to large-scale applications since it is currently applied at large-scale wastewater treatment plants for final effluent disinfection. The largest wastewater treatment UV system in the world is located in Edmonton, Alberta, Canada, with a peak design capacity of 265 MGD (Reed, 1998). For water treatment, a minimum dose is commonly specified for UV systems. The National Sanitation Foundation (NSF) standard for Class A UV systems (i.e., those that can be used as point-of-use (POU) and point-of-entry (POE) treatment devices) requires that they emit a minimum UV dose of 38 mW-sec/cm2, the dose determined to inactivate Bacillus subtilis spores (ANSI/NSF, 1991). Several states, including New Jersey and Wisconsin, have specific criteria for UV systems in the form of a minimum dose (Parrotta and Bekdash, 1998). Several European countries have also adopted minimum UV doses for pretreated drinking water (Norway at 16 mW-sec/cm2 and Austria at 30 mW-sec/cm2). All of these doses are based on the requirement to inactivate bacteria and viruses but not protozoans. There is limited information on the ability of UV irradiation to inactivate Giardia cysts. Karanis et al. (1992) conducted a laboratory study to evaluate the UV inactivation of Giardia lamblia cysts obtained from infected humans and gerbils. The test results, obtained in distilled water, are shown in Figure 11-4. The results show that a UV dose of approximately 40 mW-sec/cm2 achieved 0.5-log inactivation of Giardia lamblia, whereas a UV dose of 180 mW-sec/cm2 was required to achieve 2-log inactivation of Giardia cysts. Rice and Hoff (1981) also showed that a UV dose of 63 mW-sec/cm2 achieved 0.5-log inactivation of Giardia cysts, also in distilled water. EPA has recently developed and published a guidance document for the application of UV technology for surface water treatment (EPA, 1997a). The California Department of Health Services has set a specific dose of 140 mW-sec/cm2 as a requirement to meet the Title 22 criteria of 2.2 coliforms/100 ml in reclaimed water.
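UV dose is simply the product of average intensity and exposure time. The sketch below uses a hypothetical reactor intensity to reproduce a dose near the NSF Class A minimum; the reactor values are illustrative assumptions, not figures from the studies cited:

    def uv_dose(avg_intensity_mw_cm2: float, exposure_s: float) -> float:
        """UV dose (mW-sec/cm2) = average intensity (mW/cm2) x exposure time (s)."""
        return avg_intensity_mw_cm2 * exposure_s

    # Hypothetical reactor: an average intensity of 10 mW/cm2 held for
    # 4 seconds delivers 40 mW-sec/cm2 -- roughly the NSF Class A minimum
    # (38) but, per Karanis et al. (1992), only about 0.5-log inactivation
    # of Giardia lamblia.
    print(uv_dose(10.0, 4.0))  # 40.0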
Published information on the cost of UV disinfection systems in drinking water treatment is limited to small systems. EPA has estimated the capital cost of a UV treatment system for a 1.5-MGD plant at $200,000 (at a UV dose of 16 to 30 mW-sec/cm2). This translates into a capital unit cost of $0.13/gpd of capacity. The operations and maintenance cost of such a system is estimated at 1.5¢/1,000 gallons of water treated (Parrotta and Bekdash, 1998). On the other
hand, UV disinfection is commonly used in large-scale wastewater treatment plants, where the cost of a 12-MGD UV treatment system designed to meet water reclamation standards is estimated at $1.5 million to $2 million. This estimate is for a medium-pressure UV system treating water with 55 percent transmittance (approximately 0.260 cm-1 UV-254 absorbance) and applying a dose of 140 mW-sec/cm2. This is equivalent to a cost range of $0.13 to $0.16/gpd of capacity. These cost values indicate that the application of UV technology to large-scale water treatment is cost competitive. If UV irradiation can be proven effective against Cryptosporidium at reasonable doses (<200 mW-sec/cm2), it will become an attractive alternative to ozone, which is currently believed to be one of the few disinfectants effective for inactivating Cryptosporidium.
There are four types of UV technologies of interest to the water industry: low-pressure, low-intensity (LP-LI) UV technology; low-pressure, medium-intensity (LP-MI) UV technology; medium-pressure, high-intensity (MP-HI) UV technology; and pulsed-UV (PUV) technology. Approximately 90 percent of the UV installations in North America use LP-LI UV technology, some dating back to the 1970s. The power output of LP-LI UV lamps commonly varies from 40 to 85 W. Another unique characteristic of low-pressure lamps is that they emit monochromatic light at a wavelength of 254 nm. EPA's design manual is specifically based on and tailored to LP-LI UV technology. The primary advantage of LP-LI UV lamps is their high efficiency. The primary disadvantage is their low power output, which results in the need for a large number of lamps even at a small plant. For example, a typical secondary wastewater effluent would require approximately 40 LP-LI UV lamps per MGD of peak capacity. Considering the significant labor required to clean and maintain UV lamps, the application of LP-LI UV technology at large scale is not desirable.
LP-MI UV lamps are identical to LP-LI UV lamps with the exception of a higher power output--170 W compared to 40 to 85 W. Therefore, a typical secondary wastewater effluent would now require only 20 to 24 lamps per MGD of capacity. This makes LP-MI UV technology more applicable for medium-size water treatment facilities than LP-LI UV technology.
MP-HI UV lamps operate at substantially higher gas pressure inside the lamps than low-pressure UV lamps and are characterized by a power output that varies from 5 to 30 kW. In contrast to low-pressure lamps, which produce all of their light at approximately 254 nm, medium-pressure lamps produce a polychromatic light, of which only 25 percent falls in the germicidal wavelength range of 200 to 300 nm. However, because of the higher power output of MP-HI UV lamps, UV disinfection systems using this technology are substantially smaller than those using LP-LI UV technology, simply because significantly fewer lamps are needed. This technology has been used in small-scale water treatment and industrial applications since the 1980s. However, it was not introduced to the municipal wastewater market until 1994. Currently, more than 270 MP-HI UV systems are in operation, with 70 of them operating at municipal wastewater treatment plants. One drawback of MP-HI UV technology is its low power efficiency compared to low-pressure technology. Another drawback is its high capital cost: typically, a low-pressure UV lamp costs about $500, whereas a medium-pressure UV lamp costs about $5,000. Nevertheless, considering the substantial savings in the number of lamps, both the capital and the operations and maintenance costs of large-scale MP-HI UV systems are lower than those of LP-LI UV systems. In fact, the general assumption in the industry is that UV systems with peak flows greater than 10 MGD should utilize MP-HI UV technology in order to keep the number of UV lamps at a manageable level.
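Using the per-MGD lamp figures and lamp prices quoted above, a rough sketch of the lamp-count comparison for a hypothetical 10-MGD plant (MP-HI counts are omitted because the text gives no per-MGD figure, only a price of about $5,000 per lamp):

    # Inputs are the text's round numbers: ~40 LP-LI or 20-24 LP-MI lamps
    # per MGD for a typical secondary effluent, at roughly $500 per
    # low-pressure lamp. Treat the output as an illustration only.
    def lamp_count_and_cost(peak_mgd: float, lamps_per_mgd: float,
                            cost_per_lamp_usd: float):
        n = round(peak_mgd * lamps_per_mgd)
        return n, n * cost_per_lamp_usd

    for tech, per_mgd in [("LP-LI", 40), ("LP-MI", 22)]:
        n, cost = lamp_count_and_cost(10, per_mgd, 500)
        print(f"{tech}: ~{n} lamps, ~${cost:,.0f} in lamps")
    # LP-LI: ~400 lamps; LP-MI: ~220 lamps for the same peak flow.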
Low- and medium-pressure UV technologies are past the research stage and have been accepted as reliable disinfection technologies. In fact, specific LP-LI UV doses are listed in the Surface Water Treatment Rule (SWTR) Guidance Manual (EPA, 1991) for the inactivation of viruses in water. The cost of UV systems is also not prohibitive since the technology is less expensive than ozone and many other disinfection processes.
The newest UV technology under development is pulsed-UV technology. In this process, energy is stored in a capacitor and then released to the lamp in a short, high-intensity pulse. The duration between pulses is approximately 30 milliseconds, and each pulse lasts less than 1 millisecond. The intensity of each pulse is believed to be about 10^7 mW/cm2. One manufacturer of this technology claims that the high energy emitted with each pulse is far more effective for the inactivation of microorganisms than the same amount of energy emitted over an extended period of time. Figure 11-5 shows a plot of the inactivation rate of the MS2 bacterial virus with pulsed-UV and LP-LI UV systems. The results suggest that, for the same UV dose, pulsed-UV systems may achieve approximately one additional log of MS2 virus inactivation compared to that achieved by traditional LP-LI UV. However, questions remain about the ability to accurately measure the UV dose emitted by a pulsed-UV system.
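As a back-of-the-envelope illustration of the pulse parameters just quoted (treat the output as order-of-magnitude only; as noted, how dose should be measured for pulsed sources remains an open question):

    # Pulse parameters cited above: peak intensity ~10^7 mW/cm2, pulse
    # width under 1 ms, ~30 ms between pulses. Actual delivered dose
    # depends on reactor geometry and water UV absorbance.
    PEAK_INTENSITY_MW_CM2 = 1e7   # claimed peak intensity per pulse
    PULSE_WIDTH_S = 1e-3          # upper bound on pulse duration
    PULSE_PERIOD_S = 30e-3        # time between pulses

    dose_per_pulse = PEAK_INTENSITY_MW_CM2 * PULSE_WIDTH_S  # mW-sec/cm2
    pulses_per_second = 1.0 / PULSE_PERIOD_S
    print(dose_per_pulse, round(pulses_per_second, 1))  # 10000.0 33.3

Regardless of the type of UV technology used, several obstacles remain before this technology can be broadly applied in municipal water treatment.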