Feed aggregator

America's Cars Are Suddenly Getting Faster and More Efficient

Slashdot -

Kyle Stock and David Ingold, writing for Bloomberg: Sometime in the next couple of months, the Dodge Challenger SRT Demon and its 808 horsepower will show up in dealership windows like some kind of tiny, red, tire-melting factory. Yes, 808 horsepower. There's no typo. Last year, U.S. drivers on the hunt for more than 600 horsepower had 18 models to choose from, including a Cadillac sedan that looks more swanky than angry. Meanwhile, even boring commuter sedans are posting power specifications that would have been unheard of during the Ford Administration. The horses in the auto industry are running free. We crunched four decades of data from the Environmental Protection Agency's emission tests and arrived at a simple conclusion: All of the cars these days are fast and furious -- even the trucks. If a 1976 driver were to somehow get his hands on a car from 2017, he'd be at grave risk of whiplash. Since those days, horsepower in the U.S. has almost doubled, with the median model climbing from 145 to 283 stallions. Not surprisingly, the entire U.S. fleet grew more game for a drag-race: The median time it took for a vehicle to go from 0 to 60 miles per hour was halved, from almost 14 seconds to seven.

Read more of this story at Slashdot.

2B Pages On the Web Now Use Google's AMP, and They Now Load Twice As Fast

Slashdot -

At its developer conference I/O 2017 this week, Google also shared an update on its fast-loading Accelerated Mobile Pages (AMP). The company says that over 900,000 domains on the web have enabled AMP, and over two billion pages now load faster because of it. Taking things forward, Google says AMP access from Google Search is now twice as fast. From a report: Google first unveiled the open source AMP Project in October 2015. Since then, the company has been working hard to add new features and push AMP across not just its own products, but the larger web. Google Search only launched AMP support out of developer preview in September 2016. Eight months later, Google has already cut the time it takes to render content in half. The company explains that this is possible due to several key optimizations made to the Google AMP Cache. These include server-side rendering of AMP components and reducing bandwidth usage from images by 50 percent without affecting the perceived quality. Also helpful was the Brotli compression algorithm, which made it possible to reduce document size by an additional 10 percent in supported browsers (even Edge uses it). Google open-sourced Brotli in September 2015 and considers it a successor to the Zopfli algorithm.

Read more of this story at Slashdot.

ReactOS 0.4.5 Released

Slashdot -

An anonymous reader shares Colin Finck's forum post announcing ReactOS version 0.4.5: The ReactOS Project is pleased to release version 0.4.5 as a continuation of its three-month cadence. Beyond the usual range of bug fixes and syncs with external dependencies, a fair amount of effort has gone into the graphical subsystem. Thanks to the work of Katayama Hirofumi and Mark Jansen, ReactOS now better serves requests for fonts and font metrics, leading to improved rendering of applications and a more pleasant user experience. Your continued donations have also funded a contract for Giannis Adamopoulos to fix every last quirk in our theming components. The merits of this work can be seen in ReactOS 0.4.5, which comes with a smoother themed user interface, and the future promises even more improvements. In another funded effort, Hermes Belusca-Maito has gotten MS Office 2010 to run under ReactOS, another application from the list of most-voted apps. On top of this, there have been several major fixes in the kernel and drivers that should lead to stability improvements on real hardware and on long-running machines. The general notes, tests, and changelog for the release can be found at their respective links. ISO images and prepared VMs for testing can be downloaded here.

Read more of this story at Slashdot.

ThinkShout: Skipping a Version - Migrating from Drupal 6 to Drupal 8 with Drupal Migrate

Drupal Planet -

I recently had the opportunity to migrate content from a Drupal 6 site to a Drupal 8 site. This was especially interesting for me as I hadn’t used Drupal 6 before. As you’d expect, there are some major infrastructure changes between Drupal 6 and Drupal 8. Those differences introduce some migration challenges that I’d like to share.

The Migrate module is a wonderful thing. The vast majority of node-based content can be migrated into a Drupal 8 site with minimal effort, and for the content that doesn’t quite fit, there are custom migration sources. A custom migration source is a small class that can provide extra data to your migration in the form of source fields. Typically, a migration will map source fields to destination fields, expecting the fields to exist on both the source node type and destination node type. We actually published an in-depth, two-part blog series about how we use Drupal Migrate to populate Drupal sites with content in conjunction with Google Sheets in our own projects.

In the following example, we are migrating the value of content_field_text_author from Drupal 6 to field_author in Drupal 8. These two fields map one-to-one:

id: book
label: Book
migration_group: d6
deriver: Drupal\node\Plugin\migrate\D6NodeDeriver
source:
  key: migrate
  target: d6
  plugin: d6_node
  node_type: book
process:
  field_author: content_field_text_author
destination:
  plugin: entity:node

This field mapping works because content_field_text_author is a table in the Drupal 6 database and is recognized by the Migrate module as a field. Everyone is happy.

However, in Drupal 6, it’s possible for a field to exist only in the database table of the node type. These tables look like this:

mysql> DESC content_type_book;
+------------------------+------------------+------+-----+---------+-------+
| Field                  | Type             | Null | Key | Default | Extra |
+------------------------+------------------+------+-----+---------+-------+
| vid                    | int(10) unsigned | NO   | PRI | 0       |       |
| nid                    | int(10) unsigned | NO   | MUL | 0       |       |
| field_text_issue_value | longtext         | YES  |     | NULL    |       |
+------------------------+------------------+------+-----+---------+-------+

If we want to migrate the content of field_text_issue_value to Drupal 8, we need to use a custom migration source.

Custom migration sources are PHP classes that live in the src/Plugin/migrate/source directory of your module. For example, you may have a PHP file located at src/Plugin/migrate/source/BookNode.php that would provide custom source fields for a Book content type.

A simple source looks like this:

<?php

namespace Drupal\custom_migrate_d6\Plugin\migrate\source;

use Drupal\node\Plugin\migrate\source\d6\Node;

/**
 * @MigrateSource(
 *   id = "d6_book_node",
 * )
 */
class BookNode extends Node {

  /**
   * {@inheritdoc}
   */
  public function query() {
    $query = parent::query();
    $query->join('content_type_book', 'book', 'n.nid = book.nid');
    $query->addField('book', 'field_text_issue_value');
    return $query;
  }

}

As you can see, we are using our migration source to modify the query the Migrate module uses to retrieve the data to be migrated. Our modification extracts the field_text_issue_value column of the book content type table and provides it to the migration as a source field.

To use this migration source, we need to make one minor change to our migration. We replace this:

plugin: d6_node

With this:

plugin: d6_book_node

We do this because our migration source extends the standard Drupal 6 node migration source in order to add our custom source field.

The migration now contains two source fields and looks like this:

id: book
label: Book
migration_group: d6
deriver: Drupal\node\Plugin\migrate\D6NodeDeriver
source:
  key: migrate
  target: d6
  plugin: d6_book_node
  node_type: book
process:
  field_author: content_field_text_author
  field_issue: field_text_issue_value
destination:
  plugin: entity:node

You’ll find you can do a lot with custom migration sources, and this is especially useful with legacy versions of Drupal where you’ll have to fudge data at least a little bit. So if the Migrate module isn’t doing it for you, you’ll always have the option to step in and give it a little push.
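Assuming the contributed Migrate Tools module is installed (it provides the Drush integration for the Migrate API), you can exercise the migration from the command line with drush migrate-import book, check how many rows were processed with drush migrate-status, and undo a run with drush migrate-rollback book while you iterate on the field mappings.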

Error'd: The Maybe Compiler

The Daily WTF -

"Maybe it it compiled...maybe it didn't. I guess that I have to find out myself from here on out," wrote, Y. Diomidov.

 

Ken W. writes, "Does buying the 4 year warranty mean Amazon will replace any steaks that don't last?"

"I *KNEW* that I should have gone for the blue one!" Connor C. wrote.

 

Adam writes, "Today was a bad day to switch from Chrome."

"Looks like my HP printer had been eating paper behind my back," wrote Peter D.

 

Jeanne P. writes, "Ironically, we came across this while looking for colleges that offer Computer Science or Software Engineering majors. The pie charts show acceptance rates."

China Successfully Mines Gas From Methane Hydrate In Production Run

Slashdot -

hackingbear writes from a report via OilPrice.com: In a world first, China has successfully extracted gas from gas hydrates in a production run in the northern part of the South China Sea. According to the U.S. Department of Energy (DOE), global estimates vary, but the energy content of methane in hydrates, also known as "fire ice" or "flammable ice," is "immense, possibly exceeding the combined energy content of all other known fossil fuels." But no methane production other than small-scale field experiments has been documented so far. The China Geological Survey said that it managed to collect samples from the Shenhu area in the South China Sea in a test that started last Wednesday. Every day some 16,000 cubic meters (565,000 cubic feet) of gas, almost all of it methane, were extracted from the test field, exceeding production goals. This is expected to help greatly cut down China's coal-induced pollution and reduce reliance on politically sensitive petroleum imports controlled by the US. "The production of gas hydrate will play a significant role in upgrading China's energy mixture and securing its energy security," Minister of Land and Resources Jiang Daming said on Thursday.

Read more of this story at Slashdot.

Rising Seas Set To Double Coastal Flooding By 2050, Says Study

Slashdot -

Coastal flooding is about to get dramatically more frequent around the world as sea levels rise from global warming, researchers said Thursday. Phys.Org reports, "A 10-to-20 centimeter (four-to-eight inch) jump in the global ocean watermark by 2050 -- a conservative forecast -- would double flood risk in high-latitude regions, they report in the journal Scientific Reports." From the report: Major cities along the North American seaboard such as Vancouver, Seattle, San Francisco and Los Angeles, along with the European Atlantic coast, would be highly exposed, they found. But it would take only half as big a jump in ocean levels to double the number of serious flooding incidents in the tropics, including along highly populated river deltas in Asia and Africa. Even at the low end of this sea rise spectrum, Mumbai, Kochi, Abidjan, and many other cities would be significantly affected. To make up for the lack of observational data, Vitousek and his colleagues used computer modeling and a statistical method called extreme value theory. "We asked the question: with waves factored in, how much sea level rise will it take to double the frequency of flooding?" Sea levels are currently rising by three to four millimeters (0.10 to 0.15 inches) a year, but the pace has picked up by about 30 percent over the last decade. It could accelerate even more as continent-sized ice blocks near the poles continue to shed mass, especially in Antarctica, which Vitousek described as the sea level "wild card." If oceans go up 25 centimeters by mid-century, "flood levels that occur every 50 years in the tropics would be happening every year or more," he said.

Read more of this story at Slashdot.

Apple Is Lobbying Against Your Right To Repair iPhones, New York State Records Confirm

Slashdot -

An anonymous reader quotes a report from Motherboard: Lobbying records in New York state show that Apple, Verizon, and the tech industry's largest trade organizations are opposing a bill that would make it easier for consumers and independent companies to repair their electronics. The bill, called the "Fair Repair Act," would require electronics companies to sell replacement parts and tools to the general public, would prohibit "software locks" that restrict repairs, and in many cases would require companies to make repair guides available to the public. Apple and other tech giants have been suspected of opposing the legislation in many of the 11 states where similar bills have been introduced, but New York's robust lobbying disclosure laws have made information about which companies are hiring lobbyists and what bills they're spending money on public record. According to New York State's Joint Commission on Public Ethics, Apple, Verizon, Toyota, the printer company Lexmark, heavy machinery company Caterpillar, phone insurance company Asurion, and medical device company Medtronic have spent money lobbying against the Fair Repair Act this year. The Consumer Technology Association, which represents thousands of electronics manufacturers, is also lobbying against the bill. The records show that companies and organizations lobbying against right to repair legislation spent $366,634 to retain lobbyists in the state between January and April of this year. Thus far, the Digital Right to Repair Coalition -- which is generally made up of independent repair shops with several employees -- is the only organization publicly lobbying for the legislation. It has spent $5,042 on the effort, according to the records.

Read more of this story at Slashdot.

Jeff Geerling's Blog: Call for Sessions is open for DrupalCamp St. Louis 2017 - come and speak!

Drupal Planet -

DrupalCamp St. Louis 2017 will be held September 22-23, 2017, in St. Louis, Missouri. This will be our fourth year hosting a DrupalCamp, and we're one of the best camps for new presenters!

If you did something amazing with Drupal, if you're an aspiring themer, site builder, or developer, or if you are working on making the web a better place, we'd love for you to submit a session. Session submissions are due by August 1.

Researchers Create a T-Shirt That Monitors the Wearer's Breathing Rate In Real Time

Slashdot -

"Researchers at Universite Laval's Faculty of Science and Engineering and its Center for Optics, Photonics, and Lasers have created a smart T-shirt that monitors the wearer's respiratory rate in real time," reports Science Daily. The details have been published in the latest edition of Sensors. From the report: Unlike other methods of measuring respiratory rate, the smart T shirt works without any wires, electrodes, or sensors attached to the user's body, explains Younes Messaddeq, the professor who led the team that developed the technology. "The T shirt is really comfortable and doesn't inhibit the subject's natural movements. Our tests show that the data captured by the shirt is reliable, whether the user is lying down, sitting, standing, or moving around." The key to the smart T shirt is an antenna sewn in at chest level that's made of a hollow optical fiber coated with a thin layer of silver on its inner surface. The fiber's exterior surface is covered in a polymer that protects it against the environment. "The antenna does double duty, sensing and transmitting the signals created by respiratory movements," adds Professor Messaddeq, who also holds the Canada Excellence Research Chair in Photonic Innovations. "The data can be sent to the user's smartphone or a nearby computer." As the wearer breathes in, the smart fiber senses the increase in both thorax circumference and the volume of air in the lungs, explains Messaddeq. "These changes modify some of the resonant frequency of the antenna. That's why the T shirt doesn't need to be tight or in direct contact with the wearer's skin. The oscillations that occur with each breath are enough for the fiber to sense the user's respiratory rate."

Read more of this story at Slashdot.

Elsevier Wants $15 Million In 'Piracy' Damages From Sci-Hub and Libgen

Slashdot -

lbalbalba writes: Elsevier, one of the largest academic publishers, is demanding $15 million in damages from Sci-Hub and LibGen, who make paywalled scientific research papers freely available to the public [without permission]. A good chunk of these papers are copyrighted, many by Elsevier. Elsevier has requested a default judgment of $15 million against the defendants for their "truly egregious conduct" and "staggering" infringement. Sci-Hub's efforts are backed by many prominent scholars, who argue that tax-funded research should be accessible to everyone. Others counter that the site doesn't necessarily help the "open access" movement move forward. Sci-Hub's founder Alexandra Elbakyan defends her position and believes that what she does is helping millions of less privileged researchers to do their work properly by providing free access to research results.

Read more of this story at Slashdot.

Hacker Steals 17 Million Zomato Users' Data, Briefly Puts It On Dark Web

Slashdot -

Waqas reports via Hack Read: Recently, HackRead found a vendor going by the online handle of "nclay" claiming to have hacked Zomato and selling the data of its 17 million registered users on a popular Dark Web marketplace. The database includes emails and password hashes of registered Zomato users, while the price set for the whole package is USD 1,001.43 (BTC 0.5587). The vendor also shared a trove of sample data to prove that the data is legit. Here's a screenshot of the sample data publicly shared by "nclay." Upon testing the sample data on Zomato.com's login page, it was discovered that each and every account mentioned in the list exists on Zomato. Zomato didn't reply to our email, but the company has acknowledged the breach in its latest blog post. Here's a full preview of the blog post published by Zomato 7 hours ago: "Over 120 million users visit Zomato every month. What binds all of these varied individuals is the desire to enjoy the best a city has to offer, in terms of food. When Zomato users trust us with their personal information, they naturally expect the information to be safeguarded. And that's something we do diligently, without fail. We take cyber security very seriously -- if you've been a regular at Zomato for years, you'd agree."

Read more of this story at Slashdot.

Firefox 55: Flash Will Become 'Ask To Activate' For Everyone

Slashdot -

An anonymous reader quotes a report from BleepingComputer: Starting with the release of Firefox 55, the Adobe Flash plugin for Firefox will be set to "Ask to Activate" by default for all users. This move was announced in August 2016, as part of Mozilla's plan to move away from plugins built around the NPAPI technology. Flash is currently the only NPAPI plugin still supported in Firefox, and moving its default setting from "Always Activate" to "Ask to Activate" is another step toward dropping support for Flash altogether. This new Flash default setting is already live in Firefox's Nightly Edition and will move through the Alpha and Beta versions as Firefox nears its v55 Stable release. By moving Flash to a click-to-play setting, Firefox will indirectly start to favor HTML5 content over Flash for all multimedia content. Other browsers like Google Chrome, Brave, and Opera already run Flash on a click-to-play setting, or have it disabled by default. Firefox 55 is scheduled for release on August 8, 2017.

Read more of this story at Slashdot.

Brian Osborne: Keeping a view of upcoming events fresh in Drupal 8

Drupal Planet -

Imagine you have a view that lists upcoming events on your Drupal 8 site. There's a date filter that filters out any event whose start date is earlier than the current date. This works great until you realize that the output of the view will be cached in one or many places (dynamic page cache, internal page cache, Varnish, etc.). Once it's cached, Views doesn't execute the query and can't compare the date to the current time, so you may get older events sticking around.
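One possible mitigation (a minimal sketch, not necessarily the author's published solution; the module name mymodule and the view ID upcoming_events are assumptions, and note that the internal page cache for anonymous users ignores max-age, so this mainly helps the dynamic page cache) is to cap the lifetime of the view's cached output so the date filter is re-evaluated at least once a day:

<?php

use Drupal\views\Plugin\views\cache\CachePluginBase;
use Drupal\views\ViewExecutable;

/**
 * Implements hook_views_post_render().
 *
 * Caps the cache lifetime of the upcoming_events view (hypothetical
 * view ID) so its date filter is re-evaluated at least daily.
 */
function mymodule_views_post_render(ViewExecutable $view, &$output, CachePluginBase $cache) {
  if ($view->id() === 'upcoming_events') {
    // Expire the cached result at the next midnight, when "today"
    // changes and past events should drop out of the list.
    $output['#cache']['max-age'] = strtotime('tomorrow midnight') - \Drupal::time()->getRequestTime();
  }
}

If events can start at arbitrary times during the day, the same idea works with a shorter window, at the cost of more frequent cache rebuilds.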

Google To Launch a Jobs Search Engine In the US

Slashdot -

At its I/O 2017 conference, Google announced that it's launching a jobs search engine in the U.S. that will focus on a wide variety of jobs -- from entry-level and service industry positions to high-end professional jobs. The service will also use machine learning and artificial intelligence to better understand how jobs are classified and related, among other things. TechCrunch reports: In a few weeks, Google will begin to recognize when U.S. users are typing job search queries into Google Search, and will then highlight jobs that match the query. However, Google is not necessarily taking on traditional job search service providers with this launch -- instead, it's partnering with them. The company said that Google for Jobs will initially partner with LinkedIn, Facebook, CareerBuilder, Monster, Glassdoor, and other services. The search engine will have a number of tools that will help you find the right jobs for you. For example, you'll be able to filter jobs by location, title, category or type, date posted, or whether it's full- or part-time, among other things. The service will also show applicants things like commute time, to help them figure out if the job is too far away to consider. What makes the service interesting is that it's leveraging Google's machine learning smarts to understand how job titles are related and cluster them together.

Read more of this story at Slashdot.

Dries Buytaert: Friduction: the internet's unstoppable drive to eliminate friction

Drupal Planet -

There is one significant trend that I have noticed over and over again: the internet's continuous drive to mitigate friction in user experiences and business models.

Since the internet's commercial debut in the early 90s, it has captured success and upset the established order by eliminating unnecessary middlemen. Book stores, photo shops, travel agents, stock brokers, bank tellers and music stores are just a few examples of the kinds of middlemen who have been eliminated by their online counterparts. The act of buying books, printing photos or booking flights online alleviates the friction felt by consumers who must stand in line or wait on hold to speak to a customer service representative.

Rather than interpreting this evolution as disintermediation or taking something away, I believe there is value in recognizing that the internet is constantly improving customer experiences by reducing friction from systems — a process I like to call "friduction".

Open Source and cloud

Over the past 15 years, I've watched open source and cloud computing solutions transform content management into digital experience management. Specifically, I have observed open source and cloud-computing solutions remove friction from legacy approaches to technology. Open source takes the friction out of the technology evaluation and adoption process; you are not forced to get a demo or go through a sales and procurement process, or deal with the limitations of a proprietary license. Cloud computing took off because it also offers friduction; with cloud, companies pay for what they use, avoid large up-front capital expenditures, and gain speed-to-market.

Cross-channel experiences

Technology will continue to work to eliminate inefficiencies, and today, emerging distribution platforms continue to improve the user experience. There is a reason why Drupal's API-first initiative is one of the topics I've talked and written about most in 2016; it enables Drupal to "move beyond the page" and integrate with different user engagement systems. We're quickly headed to a world where websites are evolving into cross-channel experiences, which include push notifications, conversational UIs, and more. Conversational UIs, such as chatbots and voice assistants, will eliminate certain inefficiencies inherent to traditional websites. These technologies will prevail because they improve and redefine the customer experience. In fact, Acquia Labs was founded last year to explore how we can help customers bring these browser-less experiences to market.

Personalization and contextualization

In the 90s, personalization meant that websites could address authenticated users by name. I remember the first time I saw my name appear on a website; I was excited! Obviously personalization strategies have come a long way since the 90s. Today, websites present recommendations based on a user's most recent activity, and consumers expect to be provided with highly tailored experiences. The drive for greater personalization and contextualization will never stop; there is too much value in removing friction from the user experience. When a commerce website can predict what you like based on past behavior, it eliminates friction from the shopping process. When a customer support website can predict what question you are going to ask next, it is able to provide a better customer experience. This is not only useful for the user, but also for the business. A more efficient user experience will translate into higher sales, improved customer retention and better brand exposure.

To keep pace with evolving user expectations, tomorrow's digital experiences will need to deliver more tailored, and even predictive customer experiences. This will require organizations to consume multiple sources of data, such as location data, historic clickstream data, or information from wearables to create a fine-grained user context. Data will be the foundation for predictive analytics and personalization services. Advancing user privacy in conjunction with data-driven strategies will be an important component of enhancing personalized experiences. Eventually, I believe that data-driven experiences will be the norm.

At Acquia, we started investing in contextualization and personalization in 2014, through the release of a product called Acquia Lift. Adoption of Acquia Lift has grown year over year, and we expect it to increase for years to come. Contextualization and personalization will become more pervasive, especially as different systems of engagements, big data, the internet of things (IoT) and machine learning mature, combine, and begin to have profound impacts on what the definition of a great user experience should be. It might take a few more years before trends like personalization and contextualization are fully adopted by the early majority, but we are patient investors and product builders. Systems like Acquia Lift will be of critical importance and premiums will be placed on orchestrating the optimal customer journey.

Conclusion

The history of the web dictates that lower-friction solutions will surpass what came before them because they eliminate inefficiencies from the customer experience. Friduction is a long-term trend. Websites, the internet of things, augmented and virtual reality, conversational UIs — all of these technologies will continue to grow because they will enable us to build lower-friction digital experiences.

Climate Change is Turning Antarctica Green, Say Researchers

Slashdot -

Researchers in Antarctica have discovered rapidly growing banks of mosses on the ice continent's northern peninsula, providing striking evidence of climate change in the coldest and most remote parts of the planet. Amid the warming of the last 50 years, the scientists found two different species of mosses undergoing the equivalent of growth spurts, with mosses that once grew less than a millimeter per year now growing over 3 millimeters per year on average, the Washington Post reported on Thursday. From a report: "Antarctica is not going to become entirely green, but it will become more green than it currently is," said Matt Amesbury, co-author of the research from the University of Exeter. "This is linking into other processes that are happening on the Antarctic Peninsula at the moment, particularly things like glacier retreat which are freeing up new areas of ice-free land -- and the mosses particularly are very effective colonisers of those new areas," he added. In the second half of the 20th century, the Antarctic Peninsula experienced rapid temperature increases, warming by about half a degree per decade. Plant life on Antarctica is scarce, existing on only 0.3% of the continent, but moss, well preserved in chilly sediments, offers scientists a way of exploring how plants have responded to such changes.

Read more of this story at Slashdot.

Lullabot: Modernizing JavaScript in Drupal 8

Drupal Planet -

Mike and Matt host two of Drupal's JavaScript maintainers, Théodore Biadala and Matthew Grill, as well as Lullabot's resident JavaScript expert Sally Young, and talk about the history of JavaScript in Drupal, and attempts to modernize it.

Font Sharing Site DaFont Has Been Hacked, Exposing Thousands of Accounts

Slashdot -

DaFont.com, a popular font sharing site, has been hacked, resulting in usernames, email addresses, and hashed passwords of 699,464 user accounts being stolen. ZDNet reports: The passwords were scrambled with the deprecated MD5 algorithm, which nowadays is easy to crack. As such, the hacker unscrambled over 98 percent of the passwords into plain text. The site's main database also contains the site's forum data, including private messages, among other site information. At the time of writing, there were over half-a-million posts on the site's forums. The hacker told ZDNet that he carried out his attack after he saw that others had also purportedly stolen the site's database. "I heard the database was getting traded around so I decided to dump it myself -- like I always do," the hacker told me. Asked about his motivations, he said it was "mainly just for the challenge [and] training my pentest skills." He told me that he exploited a union-based SQL injection vulnerability in the site's software, a flaw he said was "easy to find." The hacker provided the database to ZDNet for verification.
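For context on why "easy to crack" is no exaggeration (a toy sketch; the digest below is fabricated for illustration, not taken from the breach): MD5 is fast by design and unsalted digests are deterministic, so an attacker can hash a large wordlist once and compare digests directly.

<?php

// Toy dictionary attack on an unsalted MD5 hash. Real attacks run
// billions of guesses per second against large wordlists on GPUs.
$leaked = md5('sunshine1'); // stand-in for one digest from the dump

foreach (['password', '123456', 'qwerty', 'sunshine1'] as $candidate) {
  if (md5($candidate) === $leaked) {
    echo "Cracked: $candidate\n";
    break;
  }
}

Salted, deliberately slow hashes such as bcrypt blunt exactly this attack: each guess becomes expensive, and identical passwords no longer share a digest.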

Read more of this story at Slashdot.

Google Launches Google Assistant On the iPhone

Slashdot -

At its I/O 2017 developer conference, Google announced the Google Assistant is coming to iOS as a standalone app. Previously, the only way for iOS users to get access to the Assistant was through Allo, the Google messaging app nobody uses. For those interested, you can download the Google Assistant on your iOS device here, but keep in mind that your device needs to be running iOS 9.1 or higher. VentureBeat reports: Google Assistant for iPhone won't ship on Apple's mobile devices by default, and naturally won't be as tightly integrated into the OS. But it is addressable by voice and does work with other Google apps on Apple's platform. Apple has API restrictions on iOS, so Google Assistant can't set alarms like Siri can. It can, however, send iMessages for you or start playing music in third-party apps like Spotify. You also won't be able to use the Home button to trigger Google Assistant, so you'll need to use the app icon or a widget.

Read more of this story at Slashdot.
