34% off TurboTax Deluxe 2016 Tax Software Federal & State - Deal Alert

TurboTax coaches you every step of the way and double checks your return as you go to handle even the toughest tax situations, so you can be confident you’re getting every dollar you deserve. Its typical list price of $59.99 has been reduced a generous 34% to $39.86, a deal that is exclusive to Amazon. Learn more, or take advantage of the deal now, on Amazon.

This story, "34% off TurboTax Deluxe 2016 Tax Software Federal & State - Deal Alert" was originally published by TechConnect.

Tech companies like Privacy Shield but worry about legal challenges

The future of data transfers between the EU and US is uncertain, companies say

Privacy Shield, the new international framework allowing companies to transfer customer data between the EU and the U.S., is getting good reviews so far, but some companies aren’t betting on it for the long term.

Companies using Privacy Shield worry that it may suffer the same fate as its long-used predecessor, the Safe Harbor Framework, which the European Court of Justice struck down in October 2015 after revelations of mass surveillance by the U.S. National Security Agency.

Digital Rights Ireland and French civil liberties group La Quadrature du Net have also challenged Privacy Shield in court, saying the new framework doesn’t adequately protect Europeans’ privacy.

While U.S. companies are embracing Privacy Shield, many European businesses are “still concerned that Privacy Shield will not hold up under court scrutiny, and they will find themselves in the same scenario as they were in October 2015, when the Safe Harbor agreement was struck down,” said Deema Frei, global privacy officer at Intralinks, a New York cloud-based content collaboration provider.

Some European companies see Privacy Shield certification as a “tick box” compliance exercise, she added. With some doubts about its long-term viability, companies should also consider other data transfer agreements, such as EU model clauses or binding corporate rules, she recommended.

However, if companies can get certainty about Privacy Shield’s future, and if it won’t be “attacked in the long term by data privacy activists trying to discredit it and challenge its validity, I believe it will work in the long run,” Frei added. 

More than 1,100 companies

As of early December, about five months after Privacy Shield went into effect, about 1,150 U.S. companies had signed up to handle European customer data under Privacy Shield, up from about 500 at the end of September. Another 600 U.S. companies had applications under review.

Those numbers compare to more than 4,500 U.S. companies that had participated in the Safe Harbor data-transfer program, according to the U.S. Department of Commerce.

Like Intralinks, cloud security firm CipherCloud is worried about the legal challenges to Privacy Shield, said David Berman, senior product marketing manager there.

“If a European Court decision does invalidate Privacy Shield, there will be another period of uncertainty” similar to what happened after the Safe Harbor agreement was struck down, he said. “If the new framework can withstand legal challenges it should continue to attract companies that want an overarching mechanism to transfer EU data to the U.S.”

Small and medium-size businesses, as well as cloud providers, seem to be embracing Privacy Shield, but the new data transfer rules impose more obligations than the old agreement, Berman said. 

“Privacy Shield has more privacy protections for individuals than Safe Harbor, so firms will have to be more diligent and ensure they are complying with the new privacy principles or risk public disclosure of a violation by the U.S. Department of Commerce,” he said. “Some firms may find the increased oversight, additional requirements, and sanctions for non-compliance under Privacy Shield a barrier to adoption.”

Compliance and surveillance

The number of Privacy Shield companies still lags behind the number that used Safe Harbor, which could indicate that Privacy Shield is more difficult to comply with, added Elodie Dowling, corporate vice president and general counsel for Europe, the Middle East, and Africa at BMC Software.

In addition to the legal challenges, some EU data privacy regulators have suggested that Privacy Shield “does not do enough to curtail U.S. surveillance,” Dowling added. EU privacy regulators will review the agreement in 2017.

The legal challenges may be only beginning, she added. Max Schrems, the Austrian man who led the fight against Safe Harbor, has questioned how 500 companies received certification in the first month Privacy Shield was available.

“This is undoubtedly showing that there are serious concerns around ... Privacy Shield and its ability to indeed protect EU citizens’ fundamental right of privacy when their personal data is being transferred to the U.S.,” Dowling said.

BMC has not yet signed up for Privacy Shield, instead deciding to “rely on another mechanism to safely and legally transfer personal data outside of the EU anywhere in the world”—through binding corporate rules.

For Privacy Shield to succeed, it needs support from the EU, including the data protection authorities in each member state, added David Hoffman, Intel’s associate general counsel and global privacy officer.

Intel supports the new agreement but wants to keep other mechanisms, such as binding corporate rules, in place as well, he said.

If data transfers are between subsidiaries of the same company, companies can use binding corporate rules to define the data responsibilities. As an alternative to Privacy Shield, companies can protect external transfers through model contract clauses restricting what the receiving company may do with the data. 

But companies are concerned about the future of those alternate data transfer methods as well, Hoffman said. While Privacy Shield and alternative transfer methods are in place for now, the future is uncertain.

“Some of the same arguments about Safe Harbor and Privacy Shield can be made about alternative transfer methods,” he said. “If there are concerns about law enforcement and national security agencies accessing information, then there would be the same concerns about alternative methods because those agencies can also access it when it’s transferred by other means.”

Nmap security scanner gets new scripts, performance boosts

Nmap 7.40 has new scripts that give IT administrators improved network mapping and port scanning capabilities

The Nmap Project just released the Holiday Edition of its open source cross-platform security scanner and network mapper, with several important improvements and bug fixes.

New features in Nmap 7.40 include Npcap 0.78r5, with driver signing updates to work with the Windows 10 Anniversary Update; faster brute-force authentication cracking; and new scripts for the Nmap Scripting Engine, the project’s maintainer Fyodor wrote on the Nmap mailing list.

Nmap (Network Mapper), the de facto standard network mapping and port-scanning tool, is widely used by IT and security administrators for network mapping, port scanning, and network vulnerability testing. Administrators can run Nmap against a network to find open ports, determine which hosts are available, identify the services those hosts are offering, and detect leaked network information, such as the types of packet filters and firewalls in use.

With a network map, administrators can spot unauthorized devices, ports that shouldn’t be open, or users running unauthorized services.

The Nmap Scripting Engine (NSE) built into Nmap runs scripts to scan for well-known vulnerabilities in the network infrastructure. Nmap 7.40 includes 12 new NSE scripts, bringing the total to 552, and makes several changes to existing scripts and libraries. The ssl-google-cert-catalog script has been removed from NSE, since Google no longer supports the service, and known Diffie-Hellman parameters for haproxy, postfix, and IronPort have been added to the ssl-dh-params script.

A bug in mysql.lua that caused authentication failures in mysql-brute and other scripts (affecting Nmap 7.52Beta2 and later) has been fixed, along with a crash in smb.lua when using smb-ls. The http.lua library now allows processing of HTTP responses with malformed header names.

The script http-default-accounts, which tests default credentials used by a variety of web applications and devices against a target, adds 21 new fingerprints and changes the way output is displayed. The script http-form-brute adds content management system Drupal to the set of web applications it can brute force. The brute.lua script has been improved to use resources more efficiently.

New scripts added to NSE include fingerprint-strings, to print the ASCII strings found in service fingerprints for unidentified services; ssl-cert-intaddr, to search for private addresses in TLS certificate fields and extensions; tso-enum, to enumerate usernames for TN3270 Telnet emulators; and tso-brute, which brute-forces passwords for TN3270 Telnet services.

Nmap 7.40 adds 149 IPv4 operating system fingerprints, bringing the current total to 5,336 OS fingerprints. These fingerprints let Nmap identify the operating system installed on the machine being scanned, and the list includes a wide range of hardware from various vendors. The latest additions are Linux 4.6, macOS 10.12 Sierra, and NetBSD 7.0. The Amazon Fire OS was removed from the list of OS fingerprints because “it was basically indistinguishable from Android.”

Nmap also maintains a list of service fingerprints so that it can easily detect different types of services running on the machine. Nmap now detects 1,161 protocols, including airserv-ng, domaintime, rhpp, and usher. The fingerprints help speed up overall scan times.
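
Conceptually, service fingerprinting boils down to matching the bytes a service sends back against a database of per-protocol patterns. A simplified Python sketch of the idea (these signatures are invented for illustration; real nmap-service-probes entries are far more detailed):

```python
import re

# Hypothetical, heavily simplified probe signatures, in the spirit of
# nmap's service fingerprint database (real entries differ).
SIGNATURES = [
    ("ssh",  re.compile(rb"^SSH-(\d[\d.]*)-")),       # SSH version banner
    ("http", re.compile(rb"^HTTP/1\.[01] \d{3}")),    # HTTP status line
    ("smtp", re.compile(rb"^220 .*SMTP", re.IGNORECASE)),  # SMTP greeting
]

def match_banner(banner: bytes) -> str:
    """Return the first service whose pattern matches the banner."""
    for service, pattern in SIGNATURES:
        if pattern.search(banner):
            return service
    return "unknown"
```

In practice nmap also sends protocol-specific probes and matches the responses, which is why a larger fingerprint database (1,161 protocols in this release) speeds up scans: fewer fallback probes are needed before a service is identified.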

Nmap 7.40 also adds a service probe and UDP payload for Quick UDP Internet Connections (QUIC), a secure transport developed by Google that is used with HTTP/2.

A common issue when running a network scan is the time it takes to complete when some ports are unresponsive. A new option, --defeat-icmp-ratelimit, labels unresponsive ports as “closed|filtered” to reduce overall UDP scan times. Those unresponsive ports may in fact be open, but marking them this way tells administrators which ports require additional investigation.
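
Why can’t a scanner simply call an unresponsive UDP port closed? A short, self-contained Python sketch of the underlying ambiguity (illustrative only, not nmap’s implementation):

```python
import socket

def classify_udp_port(host, port, timeout=1.0):
    """Illustrative sketch of why UDP scanning is ambiguous:
    a reply means open, an ICMP port-unreachable error means closed,
    and silence could mean either open-but-quiet or filtered."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.settimeout(timeout)
    try:
        s.connect((host, port))   # connect() so ICMP errors reach this socket
        s.send(b"\x00")
        s.recv(1024)
        return "open"             # the service answered
    except ConnectionRefusedError:
        return "closed"           # kernel surfaced ICMP port unreachable
    except socket.timeout:
        return "open|filtered"    # silence: can't tell without more probes
    finally:
        s.close()
```

Because many hosts rate-limit those ICMP port-unreachable replies, waiting for them dominates UDP scan time; the new option trades a little certainty for speed by not waiting the rate limit out.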

Source code and binary packages for Linux, Windows, and macOS are available from the Nmap Project page.

VMware removes hard-coded root access key from vSphere Data Protection

The company also fixed a stored cross-site scripting flaw in ESXi

VMware has released a hotfix for vSphere Data Protection (VDP) to change a hard-coded SSH key that could allow remote attackers to gain root access to the virtual appliance.

VDP is a disk-based backup and recovery product that runs as an open virtual appliance (OVA). It integrates with the VMware vCenter Server and provides centralized management of backup jobs for up to 100 virtual machines.

According to a VMware support article, the vSphere Data Protection (VDP) appliance contains a static SSH private key with a known password. This key allows interoperability with EMC Avamar, a deduplication backup and recovery software solution, and is pre-configured on the VDP as an AuthorizedKey.

“An attacker with access to the internal network, may abuse this to access the appliance with root privileges and further to perform a complete compromise,” VMware said.

The company rates this vulnerability as critical and developed a hotfix that can be copied and executed on the appliance to change the default SSH keys and set a new password.
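
The hotfix replaces the key on the appliance, but the general auditing idea is straightforward: compute the fingerprint of each public key in authorized_keys and compare it against a blocklist of known-bad keys. A generic Python sketch (nothing here is the actual VDP key; the blocklist contents are up to the auditor):

```python
import base64
import hashlib

def openssh_sha256_fingerprint(key_blob_b64: str) -> str:
    """OpenSSH-style SHA256 fingerprint of a base64-encoded public key blob."""
    digest = hashlib.sha256(base64.b64decode(key_blob_b64)).digest()
    return "SHA256:" + base64.b64encode(digest).rstrip(b"=").decode()

def find_flagged_keys(authorized_keys_text: str, flagged_fingerprints: set) -> list:
    """Return authorized_keys lines whose key matches a flagged fingerprint."""
    hits = []
    for line in authorized_keys_text.splitlines():
        parts = line.split()
        # Typical line: "<key-type> <base64-blob> <comment>"; skip comments.
        if len(parts) >= 2 and not line.startswith("#"):
            if openssh_sha256_fingerprint(parts[1]) in flagged_fingerprints:
                hits.append(line)
    return hits
```

The same check is what `ssh-keygen -lf` does by hand; scripting it lets administrators sweep a fleet of appliances for a key that should no longer grant access.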

Developing devices with hard-coded access credentials that users can’t change is a serious security weakness. Unfortunately, this was common practice in the past and vendors have been trying to clean up such mistakes from their devices for the past few years.

On Tuesday, VMware also fixed a stored cross-site scripting vulnerability in its vSphere Hypervisor (ESXi) product. The flaw is rated as important.

“The issue can be introduced by an attacker that has permission to manage virtual machines through ESXi Host Client or by tricking the vSphere administrator to import a specially crafted VM,” the company said in an advisory. “The issue may be triggered on the system from where ESXi Host Client is used to manage the specially crafted VM.”

VMware released security fixes for ESXi 5.5 and 6.0 to fix this flaw and advises users not to import VMs from untrusted sources.

Don't install this Windows patch: Intel System 8/19/2016 12:00:00 AM 10.1.2.80

Here are the details on the undocumented patch that mysteriously appeared yesterday

Yesterday, I started receiving reports of a recommended update that suddenly appeared in the Windows Update listing for some Windows 7 and 8.1 machines. (As "recommended," it may appear in your Windows Update Optional list, or in your Important list.) There's no KB number, which means you can't uninstall it via the "Uninstall an update" dialog, and links from Windows Update turned up nonexistent pages.  

Running a search for "8/19/2016 10.1.2.80" through the Windows Update Catalog results in 55 different downloads, all of which appear to be identical. They all have the same filename, and a random hex file comparison came up with no differences (thx to td and DougCuk). The description in the Update Catalog says it's an "INTEL USB driver update released in August 2016," and individual files are for a wide variety of processors and USB Enhanced Host Controller types.

The closest driver update I could find on the Intel site is the "Intel(R) Server Chipset Driver for Windows" version 10.1.2.77, dated Aug. 29. The dates don't line up, the version numbers don't jibe (10.1.2.77 on the Intel site, 10.1.2.80 in Windows Update), and the size is wrong (the Intel download is 2.71 MB, where the Windows Update download is 67 KB).

AskWoody poster John Hillig, referencing the Viper site, says:

  • Intel Chipset INF 10.1.2.77 -- 08/03/16 Is Not WHQL and has the chipset type CAT/INF files packaged into Intels SetupChipset.exe stand alone installer.
  • Intel Chipset INF 10.1.2.80 -- 08/19/16 via Windows Update Is WHQL and is packaged as separate chipset type CAT/INF files for install by Windows INF installer.

Which explains the differences in version numbers, dates, and file sizes.

Overnight, Windows guru Günter Born took apart the download and came to some interesting conclusions. Writing on his blog Born's Tech and Windows World, he describes how the patch appears to be destined for Broadwell and Haswell chips and for "some hardware components." Tearing into an .inf file he found this description:

; ** Filename: AvotonUSB.inf **
; ** Abstract: Assigns the null driver to devices **
; ** for yellow-bang removal and **
; ** brands Intel(R) devices **

Born examined many of the files and concludes, "The .inf files for new CPU chip sets contains a list of device ids for drivers, needed to support the CPU chipset." That is, the drivers -- null drivers, which don't do anything -- are placeholders that define device IDs for various motherboard components, getting rid of the yellow "!" in Device Manager.

That seems innocuous enough, but it looks like the installer wipes out whatever device drivers may already exist. Born cites two examples:

  • I found a case here where the optional update replaced an already installed and needed SMBus driver -- so the user was no longer able to read DIMM temperatures using Intel Desktop Utilities.
  • A second incident was reported in a user comment on my German blog post: the user's Wi-Fi adapter stalled after installing this optional update.

Bottom line: At best, installing this patch will remove some of the yellow bangs in Device Manager. At worst it'll break an already-good driver.

Avoid it.

Microsoft's ChakraCore adds WebAssembly support

Version 1.4 of the JavaScript engine core adds experimental support for the WebAssembly portable code format

ChakraCore, the open source core of the JavaScript engine powering Microsoft's Edge browser, has been upgraded with experimental support for the WebAssembly portable code format for browsers and cross-platform JIT compilation capabilities.

WebAssembly, which has been backed by browser vendors like Google, Microsoft, Mozilla, and Apple, is a highly touted portable bytecode technology intended to improve web performance.

The upgrade, version 1.4.0, also adds JIT support on Linux and macOS, and out-of-process JIT support for Edge. This change lets hosts optionally supply Chakra with an external process to act as a JIT server, which can serve any number of Chakra runtime clients.

Described as a "minor release" by a Microsoft representative, ChakraCore 1.4.0 also enables async functions by default and enhances time-travel debugging, which allows developers to look at faulting code within the full fidelity of the debugger with runtime context preserved. It works on a record-and-playback principle, with the record mode creating a trace file during execution that can then be played back.
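
The record-and-playback principle can be illustrated in a few lines of Python (a toy analogy, not Chakra's implementation): nondeterministic values are captured during a recording run and substituted back verbatim during replay, so a buggy execution can be reproduced exactly.

```python
import random

class Tape:
    """Toy record/replay sketch: capture nondeterministic inputs once,
    then feed them back so a second run reproduces the first exactly."""
    def __init__(self, mode, events=None):
        self.mode = mode                               # "record" or "replay"
        self.events = events if events is not None else []
        self.pos = 0

    def intercept(self, produce):
        if self.mode == "record":
            value = produce()          # consult the live, nondeterministic source
            self.events.append(value)  # ...and write it to the trace
            return value
        value = self.events[self.pos]  # replay: ignore the live source entirely
        self.pos += 1
        return value

recorder = Tape("record")
first_run = [recorder.intercept(lambda: random.randint(0, 10**9)) for _ in range(3)]

player = Tape("replay", events=recorder.events)
second_run = [player.intercept(lambda: random.randint(0, 10**9)) for _ in range(3)]

assert first_run == second_run   # the replay reproduces the recording
```

A real time-travel debugger intercepts far more (timers, I/O, object allocation), but the core trick is the same: the trace file stands in for every source of nondeterminism during playback.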

For memory reduction, version 1.4.0 leverages function body redeferral: Chakra can re-defer function bodies that are not currently being executed and are eligible for deferred parsing, Microsoft said.

JSRT (JavaScript Runtime) string APIs have been updated but remain experimental; the JSRT APIs are used to embed ChakraCore in applications. Version 1.4.0 also enables SharedArrayBuffer, again under an experimental flag.

Obama White House’s final tech recommendation: Invest in AI

Potential negative impacts can be offset by investments in education as well as by ensuring there is a safety net to help affected people, the White House argues

One of the most important things that the U.S. can do to improve economic growth is to invest in artificial intelligence, or AI, said the White House, in a new report. But there’s a dark side to this assessment as well.

AI-driven, intelligent systems have the potential to displace millions, such as truck drivers, from their jobs. But potential negative impacts can be offset by investments in education as well as by ensuring there is a safety net to help affected people, the White House argued, in what will likely be the Obama administration’s final report on technology policy.

Some of the report’s recommendations, which include expanded unemployment help and access to healthcare, may be anathema to a Republican-controlled Congress focused on tax reductions and spending cuts. But this report, “Artificial Intelligence, Automation, and the Economy” (PDF), which was in the works well before election day, also describes broader, technology-driven changes that will impact jobs and may pose issues for President-elect Donald Trump.

The “biggest concern” about AI “is that we won’t have enough of it—and we won’t have enough productivity growth,” said Jason Furman, chairman of the White House Council of Economic Advisers, on a telephone press briefing Tuesday. “Anything we can do to have more AI will contribute to more productivity growth and will help make possible more wage and income growth.”

The report argues that “advances in AI technology hold incredible potential to help the United States stay on the cutting edge of innovation,” and that the government “has an important role to play in advancing the AI field by investing in research and development.”

Furman previously put U.S. investment in AI research at $200 million a year, but private investments at $2.4 billion a year.

But if you read deeply into the White House report, the future society it describes may be a hostile and desperate one. It will be a society where intelligent machines move up the occupation ladder and the economic benefits go to those with the most skills, the “fortunate few,” and to the owners of capital, the top 0.01 percent by wealth. In this scenario, inequality grows.

Preventing a dismal outcome will take investment particularly in education, argued Furman. In earlier, manufacturing-related economic shifts, “we were making a really big investment to make sure that people could take advantage of the new types of jobs,” he said.

But in recent years, despite the IT revolution, productivity growth has slowed. The productivity growth rate was 2.5 percent after 1995 but slowed to 1.0 percent after 2005, the report says.
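
The gap between those two rates compounds dramatically. A quick back-of-the-envelope calculation (assuming, for illustration, that each rate were sustained for two decades):

```python
def compound(rate, years):
    """Cumulative growth factor from a constant annual growth rate."""
    return (1 + rate) ** years

fast = compound(0.025, 20)   # the post-1995 pace, sustained 20 years
slow = compound(0.010, 20)   # the post-2005 pace, sustained 20 years

print(f"2.5%/yr for 20 years: {fast:.2f}x")   # about 1.64x
print(f"1.0%/yr for 20 years: {slow:.2f}x")   # about 1.22x
```

At 2.5 percent, productivity roughly grows two-thirds in a generation; at 1.0 percent, barely a fifth, which is why the report treats the slowdown as a first-order concern.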

Furman blames this slowdown in productivity, in part, on a decline in education spending.

“We haven’t increased our investment in schooling in the way that we did in the 1930s, 40s and 50s,” said Furman. It’s “part of why we have seen an increase in inequity in the last couple of decades.”

A theme of this report is that “technology is not destiny,” meaning that good policy can offset or mitigate the impact of AI-driven change.

The White House acknowledges “substantial” uncertainties ahead in forecasting the type and rate of AI-driven change, but it warns that those changes can exceed the imaginable.

“There have long been fears that technology—the machines, the assembly lines, or the robots—would replace all human labor, but AI-driven automation has unique features that may allow it to replace substantial amounts of routine cognitive tasks in which humans previously maintained a stark comparative advantage,” the White House report notes.

AI-driven machines today have surpassed human performance at some specialized tasks, such as image recognition, said Ed Felten, deputy CTO at the White House Office of Science and Technology Policy, who was also on the call.

But, said Felten, in the next 20 years “it’s unlikely that machines will exhibit anything that resembles the kind of general-purpose, human-like intelligence that we have.”

This story, "Obama White House’s final tech recommendation: Invest in AI" was originally published by Computerworld.

Devs will lead us to the big data payoff at last

Enterprises have gotten little satisfaction from their early adventures in big data, so developers are charting their own course in the cloud

In 2011, McKinsey & Co. published a study trumpeting that "the use of big data will underpin new waves of productivity growth and consumer surplus" and called out five areas ripe for a big data bonanza. In personal location data, for example, McKinsey projected a $600 billion increase in economic surplus for consumers. In health care, $300 billion in additional annual value was waiting for that next Hadoop batch process to run.

Five years later, according to a follow-up McKinsey report, we're still waiting for the hype to be fulfilled. A big part of the problem, the report intones, is, well, us: "Developing the right business processes and building capabilities, including both data infrastructure and talent" is hard and mostly unrealized. All that work with Hadoop, Spark, Hive, Kafka, and so on has produced less benefit than we thought it would.

In part that's because keeping up with all that open source software and stitching it together is a full-time job in itself. But you can also blame the bugbear that stalks every enterprise: institutional inertia. Not to worry, though: The same developers who made open source the lingua franca of enterprise development are now making big data a reality through the public cloud.

Paltry big data progress

On the surface the numbers look pretty good. According to a recent SyncSort survey, a majority of respondents (62 percent) are looking to Hadoop for advanced/predictive analytics, with data discovery and visualization (57 percent) also commanding attention.

Yet when you examine this investment more closely, a comparatively modest return emerges in the real world. By McKinsey's estimates, we're still falling short for a variety of reasons:

  • Location-based data has seen 50 to 60 percent of potential value captured, mainly because not everyone can afford a GPS-enabled smartphone
  • In U.S. retail, we're seeing 30 to 40 percent, due to a lack of analytical talent and an abundance of still-siloed data
  • Manufacturing comes in at 20 to 30 percent, again because data remains siloed in legacy IT systems and because management remains unconvinced that big data will drive big returns
  • U.S. health care limps along at a dismal 10 to 20 percent, beset by poor interoperability and data sharing, along with a paucity of proof that clinical utility will result
  • The E.U. public sector also lags at 10 to 20 percent, thanks to an analytics talent shortage and data siloed in various government agencies

These aren't the only areas measured by McKinsey, but they provide a good sampling of big data's impact across a range of industries. To date, that impact has been muted. This brings us to the most significant hole in big data's progress: culture. As the report authors describe:

Adapting to an era of data-driven decision making is not always a simple proposition. Some companies have invested heavily in technology but have not yet changed their organizations so they can make the most of these investments. Many are struggling to develop the talent, business processes, and organizational muscle to capture real value from analytics.

Given that people are the primary problem holding up big data's progress, you could be forgiven for abandoning all hope.

Big data's cloudy future

Nonetheless, things may be getting better. For example, in a recent AtScale survey of more than 2,500 data professionals across 1,400 companies and 77 countries, roughly 20 percent of respondents reported clusters of more than 100 nodes, a full 74 percent of which are in production. This represents double-digit year-over-year growth.

It's even more encouraging to see where these nodes are running, which probably accounts for the increase in success rates. According to the same survey, more than half of respondents run their big data workloads in the cloud today and 72 percent plan to do so going forward. This aligns with anecdotal data from Gartner that interest in data lakes has mushroomed along with a propensity to build those lakes in public clouds.

This makes sense. Given that the very nature of data science -- asking questions of our data to glean insight -- requires a flexible approach, the infrastructure powering our big data workloads needs to enable this flexibility. In an interview, AWS product chief Matt Wood makes it clear that because "your resource mix is continually evolving, if you buy infrastructure it's almost immediately irrelevant to your business because it's frozen in time."

Infrastructure elasticity is imperative to successful big data projects. Apparently more and more enterprises got this memo and are building accordingly. Perhaps not surprisingly, this shift in culture isn't happening top-down; rather, it's a bottom-up, developer-driven phenomenon.

What should enterprises do? Ironically, it's more a matter of what they shouldn't do: obstruct developers. In short, the best way to ensure an enterprise gets the most from its data is to get out of the way of its developers. They're already taking advantage of the latest and greatest big data technologies in the cloud.

Google sued by employee for confidentiality policies that 'muzzle' staff

The product manager at Google says the company “must let the sun shine in”

A product manager at Google has sued the company over allegedly illegal confidentiality agreements, policies, and practices that, among other things, prohibit employees from speaking even internally about illegal conduct and dangerous product defects for fear that such statements may be used in legal discovery during litigation or sought by the government.

The alleged policies, which are said to violate California laws, restrict employees’ right to speak, work or whistle-blow, and include restrictions on speaking to the government, attorneys or the press about wrongdoing at Google or even “speaking to spouse or friends about whether they think their boss could do a better job,” according to a complaint filed Tuesday in the Superior Court of California for the city and county of San Francisco.

“The policies prohibit Googlers from using or disclosing all of the skills, knowledge, acquaintances, and overall experience at Google when working for a new employer,” according to the complaint, which alleges that the company’s confidentiality policies are contrary to the California Labor Code, public policy and the interests of the state.

Google’s Global Investigation Team “also relies on ‘volunteers’ to report other employees who might have disclosed any information” about the company, according to the complaint, which paints a picture that is in sharp contrast to the glowing image one usually gets about Google’s workplace culture and perks.

Under a program called Stopleaks, Google also asks employees to report on “strange things” around them such as anyone asking detailed questions about an employee’s project or job, according to the complaint. Employees are also said to be banned from writing creative fiction such as “a novel about someone working at a tech company in Silicon Valley,” without Google’s approving the book idea and the final draft.

The policies are said to be intended to control Google’s former and current employees, limit competition, infringe on constitutional rights, and block the reporting of misconduct. The complaint goes on to state that the case does not concern Google’s trade secrets, consumer privacy, or information that should not be disclosed under the law, but reflects the company’s use of confidentiality and other policies for illegal and improper purposes.

In the lawsuit, first reported by The Information, the employee, who has filed anonymously as John Doe, claims that Brian Katz, Google’s director of global investigations, intelligence and protective services, falsely informed some 65,000 Google employees that the plaintiff was terminated for leaking information to the press, without naming him. Katz and Google used him as a scapegoat to ensure that other employees continued to fall in line with the company’s confidentiality policies, according to the complaint, which asks that the employee not be forced “to self-publish” his name.

Google could not be immediately reached for comment on the lawsuit after business hours. The company was quoted by some news outlets as saying in a statement that its “employee confidentiality requirements are designed to protect proprietary business information, while not preventing employees from disclosing information about terms and conditions of employment, or workplace concerns.”

In September this year, the employee had complained to the California Labor and Workforce Development Agency, after which Google made an amendment that “purported to broaden Googlers’ right to discuss pay, hours or other terms of employment and to communicate with government agencies regarding violations of the law,” according to the complaint.

Employees were not informed of the amendment and other policies were not changed, and “in fact, Google’s actual policies and practices remained unchanged,” it added.

The employee has asked the California court for penalties for each of the 12 alleged violations under the Private Attorneys General Act on behalf of himself, the state of California and other Google employees.

Apple reduces prices on USB-C adapters

You still have to shell out a good amount of cash for adapters for the new MacBook Pro, but at least it’s not as much as before.

Update: Apple has extended the discount pricing on USB-C adapters to March 31, 2017. (This story originally posted on November 4, 2016.)

Apple on Friday made life a little more affordable for anyone buying a new MacBook Pro. The company has lowered the prices of many of its USB-C adapters, which are necessary for users who want to connect their devices to the new laptop. But the price cuts last only until the end of this year.

The new MacBook Pro has Thunderbolt 3/USB-C ports, which have a different connector than that of USB-A devices and cables like the iPhone sync cable. That means you need to buy an adapter. Depending on your devices, you may need to buy several adapters.

“We recognize that many users, especially pros, rely on legacy connectors to get work done today and they face a transition,” Apple said in a statement issued to the media. “We want to help them move to the latest technology and peripherals, as well as accelerate the growth of this new ecosystem. Through the end of the year, we are reducing prices on all USB-C and Thunderbolt 3 peripherals we sell, as well as the prices on Apple’s USB-C adapters and cables.”

Some of the USB-C adapters in the Apple Store include:

  • USB-C to USB Adapter ($9; was $19)
  • USB-C to Lightning Cable (2m) ($29; was $35)
  • USB-C to Lightning Cable (1m) ($19; was $25)
  • Thunderbolt 3 (USB-C) to Thunderbolt 2 Adapter ($29; was $49)

Prices have also been cut on some non-Apple adapters that are available in the Apple Store:

  • Belkin 2.0 USB-C to USB-B Printer Cable ($14; was $20)
  • Belkin USB-C to Micro-B Cable (USB 3.1) ($22; was $30)
  • Belkin USB-C to VGA Adapter ($29; was $40)
  • Belkin USB-C to Gigabit Ethernet Adapter ($26; was $35)
  • SanDisk Extreme Pro SD UHS-II Card USB-C Reader ($29; was $50)

Refer to our MacBook Pro Thunderbolt 3 adapter guide to figure out what adapters you need.

Lower prices on 4K and 5K LG displays

Adapters weren’t the only products with price cuts. Two LG displays also saw drops—significant ones, at that. Like the adapters, the price reductions are good until the end of this year.

The Apple Store is now selling the LG UltraFine 5K Display for $974, which is down from $1,300. The LG UltraFine 4K Display is now $524; it used to be $700.

Last week, Apple confirmed that the company is no longer producing a stand-alone display. That means buyers will need to consider third-party display makers like LG.

This story, "Apple reduces prices on USB-C adapters" was originally published by Macworld.

Congressional report sides with Apple on encryption debate

The bipartisan panel advises Congress to look into using legal hacking methods to break into tech products

The U.S. is better off supporting strong encryption than trying to weaken it, according to a new congressional report that stands at odds with the FBI’s push to install backdoors into tech products.

On Tuesday, a bipartisan congressional panel published a year-end report, advising the U.S. to explore other solutions to the encryption debate.

“Any measure that weakens encryption works against the national interest,” the report said.

The congressional panel formed back in March, amid the FBI’s public battle with Apple over trying to gain access to a locked iPhone belonging to the San Bernardino shooter.

Tuesday’s report essentially sides with Apple and its stance that strong encryption is vital for security. But the report also acknowledged that the technology has become an obstacle for law enforcement agencies when investigating crimes.

However, forcing U.S. companies to compromise their encryption wouldn’t necessarily solve the problem. Consumers and bad actors, for instance, would likely choose to use more secure products offered by foreign companies, the report said.

“Congress cannot stop bad actors — at home or overseas — from adopting encryption,” the report added.

Lobbying groups from the tech sector welcomed Tuesday’s report. The Computer and Communications Industry Association said weakening encryption would be “shortsighted.”

“(It) would play directly into the hands of those who would do us harm,” said association president Ed Black in an email.

Tuesday’s report advises that congressional committees explore other measures to help law enforcement agencies with their investigations. Among the suggestions was examining the use of “legal hacking” to break into tech products.

Rather than build backdoors into tech products, law enforcement can consider exploiting flaws in secure products that already exist, the report said.

The FBI resorted to this approach when it hired an unknown third-party to hack into the passcode-protected iPhone from the San Bernardino shooter. The agency’s director has suggested the FBI paid more than $1 million for the hacking tool involved.

However, any legal hacking would raise other questions, like if and when a law enforcement agency should alert tech companies about these vulnerabilities, the report said.

News agencies have already sued the FBI, demanding details over how it gained access to the San Bernardino shooter's iPhone.

Other measures Congress can explore include legally compelling criminal suspects to unlock their smartphones and finding better ways to use metadata analysis in law enforcement investigations.

Tuesday's report also emphasized the need for both the tech industry and law enforcement to foster cooperation, despite past tensions between the two sides.

"This can no longer be an isolated or binary debate. There is no 'us versus them,'" the report said.

The FBI didn’t immediately respond to a request for comment.

LG’s 4K and 5K UltraFine displays now for sale at the Apple online store

You can buy one now, but you’ll have to wait a while to actually get it delivered.

Back in June, Apple announced that it had discontinued its Thunderbolt Display, leaving Mac users wondering about their display choices. The picture became clearer in October, when Apple announced that it had worked with LG on the UltraFine 4K and 5K displays, but they wouldn’t be available until December. Well, we’re well into December, and the day’s finally here: both the LG UltraFine 4K and UltraFine 5K display are available for purchase on the online Apple store.

The displays are being offered with “special pricing” for a limited time. The 27-inch UltraFine 5K is $974. When the special offer ends on March 31, 2017, the price could climb up to the original $1,300. The 21.5-inch UltraFine 4K display is $524, down from the original $700.

According to the Apple website, if you order a display, you won’t get it right away. As of this writing, estimated delivery time for the UltraFine 4K is 5 to 6 weeks. You won’t have to wait as long for the UltraFine 5K; its delivery time is 2 to 4 weeks. You might try your luck by checking availability at your local Apple store. When I checked, the Apple website said I could pick up an UltraFine 4K today at the store nearest my home, but the UltraFine 5K display won’t be available from any local Apple store until January 20.

The UltraFine 5K display has a 5120x2880 resolution, while the UltraFine 4K display’s resolution is 4096x2304. Both displays support the P3 color gamut and have a brightness of 500 cd/m².
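Given the 27-inch and 21.5-inch screen sizes noted above, both panels work out to roughly the same pixel density. A quick check (the function name here is just illustrative):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(5120, 2880, 27.0)))   # UltraFine 5K, 27-inch
print(round(ppi(4096, 2304, 21.5)))   # UltraFine 4K, 21.5-inch
```

Both land around 218 pixels per inch, which is why macOS treats the two as equivalent "Retina" densities at different sizes.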

Both displays use a single Thunderbolt 3 cable to connect to your MacBook or MacBook Pro. In addition to video, the single connection can charge your laptop, as well as feed audio from the laptop to the display’s stereo speakers. The displays also have three USB-C ports for connecting peripherals, and built-in cameras that can be used with FaceTime.

This story, "LG’s 4K and 5K UltraFine displays now for sale at the Apple online store" was originally published by Macworld.

Mark Zuckerberg checks in on his year building A.I. for his home

Challenges and accomplishments for setting up A.I. to answer his door, entertain his daughter

Mark Zuckerberg's personal 2016 challenge to build an artificial intelligence system to run his home has been a learning experience for the co-founder and CEO of Facebook.

Some parts were simpler than expected and others were a surprising challenge, Zuckerberg said in a blog post.

"My goal was to learn about the state of artificial intelligence -- where we're further along than people realize and where we're still a long ways off," Zuckerberg wrote. "These challenges always lead me to learn more than I expected, and this one also gave me a better sense of all the internal technology Facebook engineers get to use, as well as a thorough overview of home automation."

Zuckerberg's personal home A.I. challenge is one of a string of New Year's resolutions that he has made for himself.

Over the past few years, he's challenged himself to learn to speak Mandarin, read two books a month, and meet a new person every day.

Mark Zuckerberg's home A.I. system is set up to recognize friends and family at the front door. It will let them in and notify Zuckerberg that someone has arrived.

Inspired by Jarvis, the home computer system in the Iron Man comics and movies, Zuckerberg wrote in a blog post last January that for 2016 he was going to focus on using A.I. to run his home and help him with his work.

At the time, Zeus Kerravala, an analyst with ZK Research, said he thought Zuckerberg's focus on A.I. could spur other researchers to do the same.

Today, though, Kerravala said that doesn't seem to have happened.

"If you were working on A.I., it's unlikely that Zuckerberg doing it got you more interested," he added. "If you weren't, then it's unlikely it caused you to jump in."

However, Kerravala still is happy that A.I. was Zuckerberg's focus for the year.

"Enough small advancements in A.I. will mean a big leap one day," he said. "More leaders and companies should be focused on moon shots. I think he's doing what he should be doing."

So here's what Zuckerberg said he accomplished this past year.

He reported in his post that he built a simple A.I. system that -- by using natural language processing, speech, facial recognition, and reinforcement learning -- can control his home's lights, temperature, security, music, and appliances.

The system, written in Python, Objective C, and PHP, is able to learn new words and concepts, he added, noting that it's also able to entertain his daughter Max and play Mandarin lessons for her. His system is also named Jarvis.

He used facial and image recognition to enable the system to detect if his daughter is awake and moving around in her crib, if that's the dog in the living room or if it's a rug, and if a friend or relative is at the door or if it's a stranger.

"About one-third of the human brain is dedicated to vision, and there are many important A.I. problems related to understanding what is happening in images and videos," Zuckerberg wrote. "Face recognition is a particularly difficult version of object recognition because most people look relatively similar compared to telling apart two random objects -- for example, a sandwich and a house. But Facebook has gotten very good at face recognition for identifying when your friends are in your photos. That expertise is also useful when your friends are at your door and your A.I. needs to determine whether to let them in."

To figure out who's at his door and possibly let them into his house, Zuckerberg said he installed a few cameras to get images of his visitors from different angles, along with a server to monitor the cameras and run facial recognition and check a list of people allowed entry to his home. The system also tells him when a guest has been let in.
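Zuckerberg hasn't published this part of Jarvis, but the flow he describes — match a camera frame's face against an allow-list, unlock if it's a known person, and notify the owner either way — can be sketched roughly like this. Everything here (the toy embeddings, names, and threshold) is an illustrative assumption; a real system would get embeddings from a face-recognition model:

```python
from dataclasses import dataclass
from math import dist
from typing import Optional

# Hypothetical face "embeddings": short vectors standing in for what a
# face-recognition model would produce for each approved visitor.
ALLOWED = {
    "Priscilla": (0.1, 0.9, 0.3),
    "Max":       (0.7, 0.2, 0.5),
}
MATCH_THRESHOLD = 0.25  # assumed tolerance; tuned per model in practice

@dataclass
class DoorEvent:
    name: Optional[str]
    unlocked: bool

def handle_visitor(embedding: tuple) -> DoorEvent:
    """Compare a camera frame's embedding against the allow-list."""
    best_name, best_dist = None, float("inf")
    for person, reference in ALLOWED.items():
        d = dist(embedding, reference)
        if d < best_dist:
            best_name, best_dist = person, d
    if best_dist <= MATCH_THRESHOLD:
        # Known face: unlock and report who arrived.
        return DoorEvent(name=best_name, unlocked=True)
    # Stranger: keep the door locked, still notify the owner.
    return DoorEvent(name=None, unlocked=False)

print(handle_visitor((0.12, 0.88, 0.31)))  # close to Priscilla's embedding
print(handle_visitor((0.9, 0.9, 0.9)))     # matches nobody
```

The multiple camera angles Zuckerberg mentions would feed several embeddings per visitor through the same check, reducing false rejections.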

"This type of visual A.I. system is useful for a number of things, including knowing when Max is awake so it can start playing music or a Mandarin lesson, or solving the context problem of knowing which room in the house we're in so the A.I. can correctly respond to context-free requests like 'turn the lights on' without providing a location," Zuckerberg wrote.

"Like most aspects of this A.I., vision is most useful when it informs a broader model of the world, connected with other abilities like knowing who your friends are and how to open the door when they're here. The more context the system has, the smarter it gets overall."

Zuckerberg noted that he was disappointed that some of his appliances aren't smart and connected, and the ones that are use different languages and protocols. This made coding his A.I. system more difficult.

One positive, though, is that he was able to use a Messenger bot to communicate with Jarvis.

"I programmed Jarvis on my computer, but in order to be useful I wanted to be able to communicate with it from anywhere I happened to be," he wrote. "That meant the communication had to happen through my phone, not a device placed in my home."

He used the Messenger bot because it was easier than building a separate app.

"I can text anything to my Jarvis bot, and it will instantly be relayed to my Jarvis server and processed," he added. "I can also send audio clips and the server can translate them into text and then execute those commands. In the middle of the day, if someone arrives at my home, Jarvis can text me an image and tell me who's there, or it can text me when I need to go do something."
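The relay Zuckerberg describes — text arrives from the Messenger bot, is matched against known commands on the Jarvis server, and the corresponding home-automation action runs — boils down to a command dispatcher. A minimal sketch, with all command phrases and handlers invented for illustration:

```python
# Toy handlers standing in for real home-automation actions.
def lights_on(args: str) -> str:
    return "lights: on"

def play_music(args: str) -> str:
    return f"playing: {args or 'default playlist'}"

def who_is_there(args: str) -> str:
    return "checking front-door camera..."

HANDLERS = {
    "lights on": lights_on,
    "play": play_music,
    "who's at the door": who_is_there,
}

def dispatch(text: str) -> str:
    """Match an incoming message against known commands, pass the rest as args."""
    text = text.strip().lower()
    # Longest phrase first, so "play jazz" hits "play" with "jazz" as args.
    for phrase in sorted(HANDLERS, key=len, reverse=True):
        if text.startswith(phrase):
            return HANDLERS[phrase](text[len(phrase):].strip())
    return "sorry, I don't understand that yet"

print(dispatch("Play jazz"))
print(dispatch("lights on"))
```

The real Jarvis layers natural language processing on top of this, so commands don't have to start with an exact phrase, but the dispatch-to-handler shape is the same.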

Though Zuckerberg's annual year challenge is just about over, he noted in his blog that he'll continue working on Jarvis, including building an Android app, setting up Jarvis voice terminals in more rooms, and connecting more appliances.

"In the longer term, I'd like to explore teaching Jarvis how to learn new skills itself rather than me having to teach it how to perform specific tasks," he said. "If I spent another year on this challenge, I'd focus more on learning how learning works."

This story, "Mark Zuckerberg checks in on his year building A.I. for his home" was originally published by Computerworld.

Sony Xperia XA successor rendered: Type-C port, slim bezels

Already quite the looker, the Xperia XA may be in for a successor to bring its design up to date with the latest Xperia XZ flagship. This alleged new model, with a name yet to be determined, has gotten some unofficial renders, complete with precise dimensions.

If the source turns out to be on the money, the future 5-incher will keep this year's model's tight side bezels - measuring 145 x 66.8 x 7.99mm, it's precisely as wide as the current XA. It may have gotten a millimeter and a half taller and 0.1mm thicker, but who's going to notice.

Renders of the alleged Sony Xperia XA successor

Another sign Sony is keeping up with the times even in the midrange is the USB Type-C port - that would make it the third phone by the company to employ the symmetrical port after the Xperia XZ and Xperia X Compact. It's good to see a 3.5mm jack in the renders as well, but there's no fingerprint sensor to be found.

As for looks, this potential next XA gets the flat top and bottom plates of the XZ, and the gentle curves on the sides of the back. Whether this will be its final form remains to be seen.

Motorola Moto Z Play running Nougat receives WiFi certification

Looks like Motorola's Moto Z Play smartphone is all set to receive the Android Nougat update. Model XT1635-02, running Android version 7.0, has received Wi-Fi certification from the WiFi Alliance (WFA).

While there is currently no information on exactly when the rollout will begin, given that the Nougat-powered model of the handset has started collecting the required certifications, we expect that to happen sooner rather than later.

This comes just a few days after the Moto X Play running Android Nougat (version 7.1) was spotted on benchmarking website GFXbench. The Nougat update has already been rolled out by Motorola to its Moto G4, G4 Plus, Moto Z, and Z Force smartphones.

alcatel IDOL 4S with Windows 10 to soon be available in Europe

The Windows 10-powered alcatel IDOL 4S, which is currently only available in the US (through T-Mobile), might soon hit Europe as well. A tweet from the company's French subsidiary says select European countries, including France, will be getting it.

No word on pricing, though. Also, it's worth noting that unlike alcatel's other country-specific Twitter accounts, the French account isn't verified, although it does seem to be run by an official representative.

Anyway, coming back to the smartphone, it's powered by a Snapdragon 820 SoC and sports a 5.5-inch full HD display. RAM is 4GB, while internal memory is 64GB. The handset features a 21MP/8MP camera combo, and packs in a 3,000mAh battery. There's also support for Windows Continuum and Windows Hello, as well as VR.

Samsung Galaxy A5 (2017) specs leak once again

An alleged update of Samsung's Galaxy A5 model, predictably named the A5 (2017), has been circulating the rumor mill for some time now. The handset was first spotted in an early benchmark back in August, followed by a few renders, leaks and even an alleged beta unit review on video.

Since the previous Galaxy A5 (2016) was unveiled in December, it is likely the 2017 iteration will hit shelves soon as well. We already have a pretty good idea of what to expect in terms of specs, and a fresh new leak from an Arabic source offers yet another confirmation of some of the hardware.

The Samsung Galaxy A5 (2017) will be equipped with a 5.2-inch, FullHD panel, with Gorilla Glass 4 on top. The display will almost definitely utilize Samsung's signature Super AMOLED technology. However, the form of the panel is still a debated point. The source clearly lists a 2.5D finish, which implies a small curvature near the edges of the glass. However, there are rumors floating around of a full-on double curved panel. There is little info to support such a claim, apart from some suspiciously modified renders. Thus, we are fairly confident that a 2.5D finish is all the curvature we can expect on the front of the A5 (2017).

Galaxy A5 (2017) renders

Moving on to internals, the chipset is also subject to some speculation. Sadly, the new leak provides no info on the matter. Going by past leaks, we can expect an Exynos 7870, or perhaps an updated version, dubbed the Exynos 7880. In any case, the efficient 14nm process should yield impressive battery life for the suggested 3,000 mAh pack.

Working alongside the aforementioned chip should be 3GB of RAM and 32GB of expandable storage. As for the camera department, we are likely looking at two 16MP shooters with an F/1.9 aperture - one on the front and one on the back. Other notable suggested features include an aluminium frame with 3D glass back design, a fingerprint reader, USB Type-C connector, Dual SIM support and some form of water resistance certification.

Even more Galaxy A5 (2017) renders

Hopefully, Samsung will take the wrapping off the new Galaxy A5 (2017) in time for the holidays. If not, an early January 2017 release seems most likely.

December security update hitting Samsung Galaxy S7/S7 edge units in India as well

Samsung has started pushing out a new update to the Galaxy S7 and S7 edge smartphones in India. Weighing in at around 270MB, the update brings along Android security fixes for the month of December.

There's currently no information on what other changes (if any) are included in the update.

This comes nearly a week after Galaxy S7 edge units in Australia started receiving the update. At that time, there were no reports of Australian Galaxy S7 units receiving the update, but that's likely to have changed in the past week.

Coming back to the update in India, it may take some time before you see a notification on your device. If you feel impatient, you can manually check for the update by heading to your handset's Settings menu.

Samsung Galaxy C9 Pro is now available in black

If you are looking for a good Samsung phablet to pick up, to, say, fill the void left behind by the Galaxy Note7, the S7 edge is definitely a no-brainer. However, if you don't really mind sacrificing a few things, like the curved panel and a little bit of performance, the Galaxy C9 Pro is a prime candidate.

After quite a few photo leaks and online sightings, today, the Korean giant finally released the black variant of the handset, arguably making it even more appealing.

Samsung Galaxy C9 Pro in black

As previously mentioned, nothing was changed in the styling or internals of the smartphone, apart from the new finish. However, that should not be underestimated, as the only other alternative colors are Gold and a different shade of gold - Rose Gold. Just like with the recently released Black Pearl Samsung Galaxy S7 edge, this should broaden the model's appeal.

However, before you get too excited and go looking for the new paintjob online, it appears to be official only in Korea for now. The question of international availability is still unclear, but we really hope Samsung brings it to the rest of its markets as well.

As for a quick specs summary - the Galaxy C9 Pro comes with a 6-inch 1080p Super AMOLED panel, Snapdragon 653 SoC, 6GB of RAM and 64GB of expandable storage. Its camera setup consists of a 16MP, f/1.9 main shooter and another 16MP one on the front. It is a dual-SIM device, boots Android 6.0 Marshmallow and is powered by a 4,000 mAh battery.

Source (in Korean)

Samsung's investigation of the Note7 debacle complete, findings report sent out to labs

We all know the sad story of the exploding Samsung Galaxy Note7 that lived a very short life and got recalled and discontinued due to safety concerns.

Initially, Samsung had blamed a battery supplier for the issue, but it later turned out that wasn't the case, and the company struggled to find the exact cause. Samsung committed to investigating and revealing the cause of the issue by the end of the year, and it seems that the Korean maker is set to make good on that promise.

The internal investigation is apparently complete and Samsung has sent the findings report to the Korea Testing Laboratory and UL (an American safety organization), among others. The report is, however, yet to be released to the general public (us included).

Meanwhile, Instrumental suggested that extremely tight internal margins were the reason for the exploding batteries, but that remains to be confirmed by Samsung's own analysis.

BlackBerry branded smartphones will be made and sold by TCL from now on

Back in September, BlackBerry first announced that it would stop designing and building phones itself. The Canadian company said it would rely on "hardware partners" going forward, and these companies would be the ones making and selling BlackBerry branded phones in the future.

Today BlackBerry is basically making that partner official. The company in question is TCL, the Chinese multinational electronics corporation that also sells mobile devices under the Alcatel brand. This move is anything but surprising, since TCL and BlackBerry have already cooperated on two smartphones, namely the DTEK50 and DTEK60.

BlackBerry DTEK50 by TCL

Both of those have been rebranded Alcatel designs, so perhaps we should expect more of this strategy in the future. Regardless, TCL will "design, manufacture, sell and provide customer support for BlackBerry-branded mobile devices", according to the official press release. BlackBerry on the other hand "will license its security software and service suite, as well as related brand assets to TCL Communication".

The document goes on to state that "BlackBerry will continue to control and develop its security and software solutions, serve its customers and maintain trusted BlackBerry security software, while TCL Communication will manage all sales and distribution and serve as a global distributor of new BlackBerry-branded mobile devices along with dedicated sales teams".

TCL will be the exclusive global manufacturer and distributor for all upcoming BlackBerry-branded smartphones, with the exception of a few markets, namely India, Sri Lanka, Nepal, Bangladesh, and Indonesia. It's unclear what BlackBerry has planned for those countries, but perhaps a different hardware partner will take care of building and selling phones over there.

Motorola Moto M lands in Europe

The Motorola Moto M - which was made official last month, and is currently only available in a couple of Asian countries (India and China) - has now landed in Europe as well.

Specifically, the phone has been launched in Slovakia carrying a €279 (around $290) price tag. It is expected to hit other European markets soon.

Specs-wise, the device is powered by a Helio P10 chipset (Helio P15 in India) and sports a 5.5-inch full HD display. It comes in 3GB/32GB and 4GB/64GB memory options, features a 16MP/8MP camera combo, and packs in a 3,050mAh battery. The device runs Android 6.0.1 Marshmallow out of the box.

Sony schedules its CES 2017 press conference

Sony has scheduled its press conference for CES - the tech bonanza officially starts January 5, but an impatient Sony will reveal its new gadgets on January 4 (Wednesday).

The event starts at 5:00pm local time in Las Vegas and if you can’t attend, you can at least watch the livestream. And if you are there, make sure to visit the Sony Booth (17300) at the Las Vegas Convention Center.

What can we expect? Well, two Sony Xperia phones - the G3121 and G3112 - will be shown off according to rumors, and despite early impressions, they will be powered by Snapdragon 820. Maybe we’ll get a progress report on the Nougat update for current Xperias. Also, chances are we’ll be hearing from the PlayStation division as well.

Nougat update confirmed for Xiaomi's Mi Note, Mi 4c, Mi 4s, and Mi Max smartphones

After rolling out the Nougat update to its Mi 5 smartphone last week, Xiaomi has now confirmed more of its phones that'll get the update. The confirmation came in the form of a Weibo post from a company representative.

The list includes the Mi 4c, Mi 4s, Mi Note, and Mi Max. There's currently no information on exactly when the update will be rolled out, although reports say that might happen sometime in the first quarter of 2017. Other Xiaomi devices that'll get the update include the Mi 5s, Mi Note 2, and Mi Mix.

Samsung Galaxy S8 to come with a rear-mounted fingerprint sensor

In today's episode number 'who's still counting' of the Galaxy S8 rumor saga, it is suggested that a rear-mounted fingerprint sensor will make it to the next-gen Samsung flagship. The reason cited is that in initial testing, the fingerprint sensor embedded in the display has proven inaccurate, and a dedicated sensor is still needed.

Why, what's wrong with the Home button, you ask? There will be no Home 'button', strictly speaking, as multiple sources have already stated, and the entire front of the phone will be mostly display. That's why when Synaptics (a Samsung component supplier) announced the industry's first optical fingerprint scanner earlier this week, everyone expected the S8 to mark its debut. Well, this source says 'no'.

Samsung Galaxy S7

Additionally, the Galaxy S8 will come with the iris recognition tech from the ill-fated Note7 that uses a dedicated camera for the purpose, unlike the LG Innotek implementation where the two are fitted in a single module. Iris recognition, however, is still not as popular as fingerprint scanning, so let's not be too courageous by getting rid of the latter just yet, Samsung must have thought.

We'll file this in the 'plausible' category, but obviously there's no way of knowing for sure just yet.

Attackers use hacked home routers to hit Russia's 5 largest banks

The routers were likely hacked through a recent vulnerability in the TR-069 management protocol

Botnets made up of hacked home routers were used to launch distributed denial-of-service attacks against the five largest financial organizations in Russia.

The attacks occurred on Monday, Dec. 5, and were detected and mitigated by Rostelecom, Russia’s state-owned telecommunications company. The attacks peaked at 3.2 million packets per second (Mpps) and the longest attack lasted for over two hours, Rostelecom reported Friday.

The company did not provide a bandwidth measurement for the attacks, but 3.2 Mpps is not that much. DDoS mitigation providers regularly see attacks that exceed 100 Mpps, and a very large September attack against the website of cybersecurity blogger Brian Krebs peaked at 665 Gbps and 143 Mpps.
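For a sense of scale, a flood sustained at the reported 3.2 Mpps peak across the two-hour attack window would add up to roughly 23 billion packets (a back-of-the-envelope figure, assuming the peak rate held for the full duration):

```python
pps = 3.2e6            # peak packets per second, as reported by Rostelecom
duration_s = 2 * 3600  # the longest attack lasted just over two hours
total_packets = pps * duration_s
print(f"{total_packets:.2e} packets")  # -> 2.30e+10 packets
```

The actual total was likely lower, since the rate fluctuates over an attack, but it illustrates why per-packet processing cost, not just bandwidth, matters for SYN-flood mitigation.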

This week’s DDoS attacks against the Russian banks used the TCP SYN flood technique and originated from hacked home routers, according to Muslim Medzhlumov, director of Rostelecom’s cybersecurity center.

A common trait for these routers is that all of them were using the CPE WAN Management Protocol (CWMP), also known as TR-069. This is a protocol used by ISPs to remotely manage routers installed in their customers’ homes.

A vulnerability was recently found in the TR-069 implementation from routers handed out to users by ISPs in multiple countries, including Deutsche Telekom in Germany, Eir in Ireland and TalkTalk in the U.K. Attackers quickly took advantage of the flaw to infect thousands of devices with malware and it’s very likely that some of them were used to launch the attacks against the Russian banks.
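TR-069 management interfaces conventionally listen on TCP port 7547, which is the port the recent router attacks probed. A hedged way for a user to check whether a device exposes that port from a given vantage point is a simple TCP connection test (this only checks reachability, not vulnerability; the function name is illustrative):

```python
import socket

def port_open(host: str, port: int = 7547, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe your own router's LAN address (value here is a placeholder).
print(port_open("192.168.1.1"))
```

A port that answers from the WAN side doesn't prove a router is exploitable, but it is the exposure the Deutsche Telekom and TalkTalk attackers scanned for, so ISPs responded by filtering it.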

Last Friday, the Russian Federal Security Service, the FSB, said that it foiled a large-scale cyberattack planned by a foreign intelligence service that aimed to destabilize the country’s financial system.

The attack was planned for Dec. 5, according to the FSB, and would have included spreading fake claims about a crisis in the country’s financial system via social media and text messages. It’s not clear whether DDoS was also part of the plan and if the attacks mitigated by Rostelecom are related to the foiled campaign.

DDoS attacks against banks are not unusual. In 2012, crippling DDoS attacks disrupted the online services of multiple banks in the U.S. In July 2015, three banks in the U.K. suffered similar disruptions. According to the FBI, financial institutions regularly receive extortion emails from hackers threatening to disrupt their services.

Obama orders review of election hacks as Trump doubts Russia's role

The review is scheduled to be completed before Obama leaves office

President Barack Obama has ordered U.S. intelligence agencies to conduct a full review of the cyberattacks that allegedly tried to disrupt this year’s election, as his successor Donald Trump casts doubt over Russia’s possible involvement. 

Obama’s homeland security advisor Lisa Monaco first mentioned the need for the review while speaking to reporters on Friday morning, according to Politico.

“We may be crossed into a new threshold, and it is incumbent upon us to take stock of that, to review, to conduct some after-action, to understand what this means, and to impart those lessons learned,” Monaco reportedly said.

The review is scheduled to be completed before Obama leaves office on Jan. 20. It will produce a report that will be shared with members of Congress, some of whom have already been calling for a wider investigation.

On Friday, Obama deputy press secretary Eric Schultz said the review will also look into election hacking activities that took place prior to this year’s presidential race, going back to 2008. The president intends to make as much of the report public as possible.

In October, U.S. intelligence agencies publicly blamed the Russian government for sponsoring high-profile hacks against U.S. political targets as a way to interfere with the election. However, the intelligence agencies didn’t provide specific evidence to support their claims.

Among the hacks was a high-profile breach at the Democratic National Committee that some security firms blamed on elite Russian cyberespionage teams. Sensitive files from the DNC were stolen as part of that hack and then leaked online, potentially damaging presidential candidate Hillary Clinton’s reputation. 

Russian hackers also allegedly stole emails from a Clinton aide that were later published by WikiLeaks just weeks before Election Day. 

Russia has denied any involvement. But that hasn’t stopped U.S. lawmakers from drafting legislation that would form a bipartisan commission to investigate the Russian government’s possible role in the hacks.

The Obama administration has also considered retaliating against Russia for the alleged cyberattacks. However, President-elect Donald Trump voiced doubts about the Russian government’s involvement.

“It could be Russia. And it could be China. And it could be some guy in his home in New Jersey,” Trump said in an interview with Time magazine conducted in late November.

On Friday, Representative Adam Schiff, a California Democrat, said he approved of the Obama administration’s move to order a full review of the hacking. He also called Trump’s denial of Russian involvement “disturbing,” and said the U.S. needed to respond to the Kremlin’s cyber meddling. 

“After many briefings by our intelligence community, it is clear to me that the Russians hacked our democratic institutions and sought to interfere in our elections and sow discord,” Schiff said in a statement.

John Bambenek, a researcher at security firm Fidelis Cybersecurity, said the goal of Obama’s review was probably to prevent Trump from further casting doubt on Russia’s alleged involvement in the hacks.

“It does seem that President Obama wants to make a strong case that Russia was involved and potentially box President-elect Trump in on deflecting blame,” Bambenek said in an email.

OpenVPN will be audited for security flaws

Cryptographer Matthew Green will analyze the popular software for flaws

The next major version of OpenVPN, one of the most widely used virtual private networking technologies, will be audited by a well-known cryptography expert.

The audit will be fully funded by Private Internet Access (PIA), a popular VPN service provider that uses OpenVPN for its business. The company has contracted cryptography engineering expert Matthew Green, a professor at Johns Hopkins University in Baltimore, to carry out the evaluation with the goal of identifying any vulnerabilities in the code.

Green has experience auditing encryption software: he is one of the founders of the Open Crypto Audit Project, which organized a detailed analysis of TrueCrypt, a popular open-source full-disk encryption application. TrueCrypt was abandoned by its original developers in 2014, but its code has since been forked and improved as part of other projects.

Green will evaluate OpenVPN 2.4, which is currently the release candidate for the next major stable version. For now, he will look for vulnerabilities in the source code available on GitHub, but he will compare his results against the final version when it is released in order to complete the audit.

Any issues that are found will be shared with the OpenVPN developers and the results of the audit will only be made public after they have been patched, PIA’s Caleb Chen said in a blog post.

“Instead of going for a crowdfunded approach, Private Internet Access has elected to fund the entirety of the OpenVPN 2.4 audit ourselves because of the integral nature of OpenVPN to both the privacy community as a whole and our own company,” Chen said.

The OpenVPN software is cross-platform and can be used in both server and client modes. End users rely on it to connect to VPN servers, and companies use it to set up such servers. The software is also integrated into commercial consumer and business products.
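For readers unfamiliar with how OpenVPN is typically deployed, a minimal client-side configuration looks something like the sketch below; the server hostname and certificate filenames are placeholders, and a real deployment would use files issued by the operator.

```
client
dev tun
proto udp
remote vpn.example.com 1194   # placeholder server hostname and port
resolv-retry infinite
nobind
persist-key
persist-tun
ca ca.crt                     # placeholder certificate/key filenames
cert client.crt
key client.key
remote-cert-tls server
cipher AES-256-GCM
verb 3
```

The same software, launched with a server-style configuration instead, acts as the endpoint that clients like this one connect to.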
