India’s Richest Man is Ready To Take on Amazon and Walmart

January 2, 2020

As Amazon and Walmart-owned Flipkart spend billions to make a dent in India’s retail market and reel from recent regulatory hurdles, the two companies have stumbled upon a new challenge: Mukesh Ambani, Asia’s richest man.

From a report: Reliance Retail and Reliance Jio, two subsidiaries of Ambani’s Reliance Industries, said they have soft-launched JioMart, their e-commerce venture that works closely with neighborhood stores, in parts of the state of Maharashtra — Navi Mumbai, Kalyan and Thane. The e-commerce venture, which is being marketed as “Desh Ki Nayi Dukaan” (Hindi for “the country’s new store”), currently offers a catalog of 50,000 grocery items and promises “free and express delivery.” In an email to Reliance Jio users, the two subsidiaries said they plan to expand the service to many parts of India in the coming months. The joint venture has also urged Jio subscribers to sign up for JioMart to access introductory offers. A Reliance spokesperson declined to share more.

The soft launch this week comes months after Ambani, who runs Reliance Industries — India’s largest industrial house — said that he wants to serve tens of millions of retailers and store owners across the country. If there is anyone in India positioned to compete with the heavily backed Amazon and Walmart, it’s Ambani. Reliance Retail, founded in 2006, is the largest retailer in the country by revenue. It serves more than 3.5 million customers each week through its nearly 10,000 physical stores in more than 6,500 Indian cities and towns. Reliance Jio is the second largest telecom operator in India, with more than 360 million subscribers.

Source: Slashdot.org

Day 1 of 365 @ 2020

January 2, 2020

Made a small start on mastering the C programming language by completing the first 2 chapters of the Let Us C book.

I tried studying other subjects (UX) alongside the C foray but was not comfortable. I will study all of C intensively first, and only then move to something else.

Will master 2 more chapters from the same book today. I am making handwritten notes and really enjoying learning again after a long time.

2020 and Me

December 31, 2019

I have made many resolutions for the new year 2020.

01 – Reading all 20+ UX Books that I have ASAP
02 – Learning SketchAPP
03 – Learning HTML5 & CSS3
04 – Learning JavaScript & jQuery
05 – Learning Java from UDEMY
06 – Learning C from UDEMY and all books that I have
07 – Learning C++ from UDEMY and all books that I have
08 – Learning Java, Data Structures, Advanced Algorithms & Design Patterns
09 – Learning Python & Django from UDEMY
10 – Learning Databases & MySQL from UDEMY
11 – Learning all courses at Interaction-Design.org
12 – Reading all PDFs that I have from Summaries.com

When 2021 begins, I will be ready for anything.

Shopify CEO Says Long Hours Aren’t Necessary For Success

December 28, 2019

Tobi Lutke, the founder and CEO of $48 billion e-commerce cloud-software company Shopify, took to Twitter to remind us all that we don’t need to work 80 hours a week to be successful.

Business Insider reports:”I realize everyone’s twitter feed looks different. But I’ll go ahead and subtweet two conversations that I see going by right now: a) How the heck did Shopify get so big this decade and b) You have to work 80 hours a week to be successful,” he tweeted. He says he and his cofounders have grown this company from a profitable bootstrap to its multibillion-dollar status without him ever sleeping under his desk. “I’ve never worked through a night. The only times I worked more than 40 hours in a week was when I had the burning desire to do so. I need 8ish hours of sleep a night. Same with everybody else, whether we admit it or not,” he tweeted.

Shopify has had a spectacular few years. Its revenues have doubled since 2017, solidly beating Wall Street estimates quarter after quarter, growing from over $171 million in Q3 September 2017 to over $390 million in Q3 September 2019, its latest complete quarter. It’s expected to finish the year at about $1.5 billion in revenues. And Wall Street has noticed. Shopify went public in 2015. In the past year, the stock has soared over 200% from around $134 to about $407, giving the company a $47.6 billion market cap. But even at the scale of its current operations, he says he doesn’t let his job overshadow the rest of his life. “I’m home at 5:30pm every evening. I don’t travel on the weekend. I play video games alone, with my friends, and increasingly with my kids. My job is incredible, but it’s also just a job. Family and personal health rank higher in my priority list,” he tweeted.

“For creative work, you can’t cheat. My belief is that there are five creative hours in everyone’s day. All I ask of people at Shopify is that four of those are channeled into the company,” he wrote. “What’s even better than people are teams,” he wrote. “We don’t burn out people. We give people space. We love real teams with real friendship forming.” He adds: “None of that is even about product, or market fit, or timing. It’s all about people. Treating everyone with dignity.”

“We are not moist robots. We are people and people are awesome.”

Source: Slashdot.org

Google Brain’s AI Achieves State-of-the-Art Text Summarization Performance

December 27, 2019

A Google Brain and Imperial College London team have built a system — Pre-training with Extracted Gap-sentences for Abstractive SUmmarization Sequence-to-sequence, or Pegasus — that leverages Google’s Transformer architecture combined with pretraining objectives tailored for abstractive text generation. From a report: They say it achieves state-of-the-art results in 12 summarization tasks spanning news, science, stories, instructions, emails, patents, and legislative bills, and that it shows “surprising” performance on low-resource summarization, surpassing previous top results on six data sets with only 1,000 examples. As the researchers point out, text summarization aims to generate accurate and concise summaries from input documents, in contrast to extractive techniques. Rather than merely copying fragments from the input, abstractive summarization may produce novel words or cover principal information such that the output remains linguistically fluent.
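The “gap-sentence” pretraining idea can be illustrated with a toy sketch: whole sentences are masked out of a document, and the masked text becomes the summary the model must generate. This is a simplified, hypothetical illustration with names of my own choosing; the real Pegasus objective scores sentence importance (e.g. against the rest of the document), not sentence length as done here.

```python
# Toy sketch of Pegasus-style gap-sentence generation (GSG) data preparation.
# Selected sentences are replaced by a mask token in the source; the removed
# sentences become the target "summary" the model learns to generate.

MASK = "<mask_1>"

def make_gsg_example(sentences, gap_ratio=0.3):
    """Mask ~gap_ratio of the sentences; masked text becomes the target."""
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    # Simplified "importance" score: prefer longer sentences.
    ranked = sorted(range(len(sentences)), key=lambda i: -len(sentences[i]))
    gaps = set(ranked[:n_gaps])
    source = " ".join(MASK if i in gaps else s for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(gaps))
    return source, target

doc = [
    "Pegasus is a pretraining method for abstractive summarization.",
    "It masks whole sentences from a document.",
    "The model learns to generate the missing sentences.",
]
src, tgt = make_gsg_example(doc)
print(src)  # document with one sentence replaced by <mask_1>
print(tgt)  # the masked sentence, used as the generation target
```

The point of the objective is that generating missing sentences from the surrounding document closely resembles producing an abstractive summary, so the pretraining task transfers well to summarization with few labeled examples.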

Transformers are a type of neural architecture introduced in a paper by researchers at Google Brain, Google’s AI research division. As do all deep neural networks, they contain functions (neurons) arranged in interconnected layers that transmit signals from input data and slowly adjust the synaptic strength (weights) of each connection — that’s how all AI models extract features and learn to make predictions. But Transformers uniquely have attention. Every output element is connected to every input element, and the weightings between them are calculated dynamically.
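The attention mechanism described above can be sketched in a few lines of NumPy. This is a minimal, self-contained illustration of scaled dot-product self-attention, not code from any paper; the function name and array shapes are my own.

```python
# Minimal sketch of scaled dot-product attention: every output position is
# connected to every input position, and the weights between them are
# computed dynamically from the data rather than being fixed parameters.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays. Returns weighted values and weights."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # similarity of each query to each key
    # Softmax over the input positions (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # 4 tokens, 8-dimensional embeddings
out, w = scaled_dot_product_attention(x, x, x)  # self-attention
print(w.shape)  # (4, 4): each output token weights all 4 input tokens
```

Each row of `w` sums to 1, so every output is a data-dependent weighted average of all inputs, which is exactly the “every output element is connected to every input element” property described above.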

Read More: SlashDot.org

Baidu Has a New Trick For Teaching AI the Meaning of Language

December 27, 2019

Baidu, China’s closest equivalent to Google, has achieved the highest score at the General Language Understanding Evaluation (GLUE) AI competition. What’s notable about Baidu’s achievement is that it illustrates how AI research benefits from a diversity of contributors.

GLUE is a widely accepted benchmark for how well an AI system understands human language. It consists of nine different tests for things like picking out the names of people and organizations in a sentence and figuring out what a pronoun like “it” refers to when there are multiple potential antecedents. A language model that scores highly on GLUE, therefore, can handle diverse reading comprehension tasks. Out of a full score of 100, the average person scores around 87 points. Baidu is now the first team to surpass 90 with its model, ERNIE.

Baidu’s researchers had to develop a technique specifically for the Chinese language to build ERNIE (which stands for “Enhanced Representation through kNowledge IntEgration”). It just so happens, however, that the same technique makes it better at understanding English as well. […] [T]he researchers trained ERNIE on a new version of masking that hides strings of characters rather than single ones. They also trained it to distinguish between meaningful and random strings so it could mask the right character combinations accordingly. As a result, ERNIE has a greater grasp of how words encode information in Chinese and is much more accurate at predicting the missing pieces. This proves useful for applications like translation and information retrieval from a text document. The researchers very quickly discovered that this approach actually works better for English, too. Though not as often as in Chinese, English similarly has strings of words that express a meaning different from the sum of their parts. Proper nouns like “Harry Potter” and expressions like “chip off the old block” cannot be meaningfully parsed by separating them into individual words.
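The span-masking idea can be shown with a toy example. In this hypothetical sketch the meaningful spans are supplied by hand, whereas ERNIE learns which strings form meaningful units; the function and variable names are my own.

```python
# Toy illustration of span masking: instead of masking single tokens, mask
# whole meaningful spans ("Harry Potter") so the model must predict the
# full unit from surrounding context rather than from its other half.

def mask_spans(tokens, spans, mask_token="[MASK]"):
    """Replace each (start, end) span of tokens with mask tokens."""
    out = list(tokens)
    for start, end in spans:
        for i in range(start, end):
            out[i] = mask_token
    return out

tokens = ["Harry", "Potter", "is", "a", "series", "of", "fantasy", "novels"]
# Single-token masking might hide only "Potter", which is trivially guessed
# from the neighboring "Harry". Span masking hides the whole name:
print(mask_spans(tokens, [(0, 2)]))
# ['[MASK]', '[MASK]', 'is', 'a', 'series', 'of', 'fantasy', 'novels']
```

Forcing the model to recover the entire span pushes it to learn what the multi-word unit means in context, rather than exploiting local co-occurrence between the words inside the unit.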

The latest version of ERNIE uses several other training techniques as well. It considers the ordering of sentences and the distances between them, for example, to understand the logical progression of a paragraph. Most important, however, it uses a method called continuous training that allows it to train on new data and new tasks without it forgetting those it learned before. This allows it to get better and better at performing a broad range of tasks over time with minimal human interference. Baidu actively uses ERNIE to give users more applicable search results, remove duplicate stories in its news feed, and improve its AI assistant Xiao Du’s ability to accurately respond to requests.

The researchers have described ERNIE’s latest architecture in a paper that will be presented at the Association for the Advancement of Artificial Intelligence conference next year.

Read More: SlashDot.org

Slashdot Asks: What’s Your Favorite Podcast?

December 25, 2019

Pocket Casts, one of the most widely used podcast apps, has shared a list of the podcasts most subscribed to by its user base this year. The top 10 podcasts this year were:

1. The Joe Rogan Experience.
2. This American Life.
3. Stuff You Should Know.
4. Serial.
5. The Daily.
6. Reply All.
7. Waveform: The MKBHD Podcast.
8. Dan Carlin’s Hardcore History.
9. Radiolab.
10. 99% Invisible.

Source: SlashDot.org

Still 7 Days Until the New Year 2020

December 25, 2019

Lap 001 – Begins Today (Passionate Programmer)

Facebook Owns The 4 Most Downloaded Apps Of The Decade

December 20, 2019

History may not look kindly upon it, but the 2010s really was the decade of Facebook.

From a report:

A new report from analytics firm App Annie revealed the top 10 most downloaded mobile apps of the decade, and Facebook has an unsurprising grip on the whole operation. Facebook, Messenger, WhatsApp, and Instagram were the four most downloaded apps across Android and iOS in the 2010s, and all four come from the same company. It is, if nothing else, a clear view of just how much Facebook dominated our daily lives from 2010 to 2019. Its flagship app and Messenger spin-off both have user counts in the billions, while Instagram and WhatsApp are household names.

Source: SlashDot.org

How AI Will Eat UI

December 20, 2019

The inevitable day when machines learn to design our apps.

From a report:

When AR wearables hit the market, our apps will start tracking both our conscious and subconscious behavior. By measuring our heart rate, respiration, pupil size, and eye movement, our AIs will be able to map our psychology in high resolution. And armed with this information, our interfaces will morph and adapt to our mood as we go about our day. Future interfaces will not be curated, but tailored to fulfill our subconscious needs. Maybe the best way to navigate a digital ecosystem isn’t through buttons and sliders. Maybe the solution is something more organic and abstract.

Autodesk is developing a system that uses Generative Design to create 3D models. You enter your requirements, and the system spits out a solution. The method has already produced drones, airplane parts, and hot rods. So it’s only a matter of time before we start seeing AI-generated interfaces. This may all sound far out, but the future tends to arrive sooner than we expect. One day, in a brave new world, we will look at contemporary interfaces the same way we look at an old typewriter: gawking at its crudeness and appreciating how far we’ve come.

Read More: SlashDot.org