Recent Posts

Pages: [1] 2 3 ... 10
1
Software Engineering / Re: Facebook is about to start investing in local news
« Last post by s.arman on Yesterday at 08:57:16 PM »
nice
2
Machine Learning / Deep Learning / R tutorial: How to import data into R
« Last post by s.arman on Yesterday at 08:49:46 PM »
Before you can analyze and visualize data, you have to get that data into R. There are various ways to do this, depending on how your data is formatted and where it’s located.

Usually, the function you use to import data depends on the data’s file format. In base R, for example, you can import a CSV file with read.csv(). Hadley Wickham created a package called readxl that, as you might expect, has a function to read in Excel files. There’s another package, googlesheets, for pulling in data from Google spreadsheets.
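
As a minimal sketch, here is roughly how those format-specific imports look in practice (the file names and the Google sheet title below are made-up examples, and the readxl and googlesheets packages have to be installed first):

# Base R: read a CSV file into a data frame
sales <- read.csv("sales.csv")

# readxl: read the first sheet of an Excel workbook
library(readxl)
budget <- read_excel("budget.xlsx", sheet = 1)

# googlesheets: register a Google spreadsheet by its title, then read it
library(googlesheets)
ss <- gs_title("Survey responses")
survey <- gs_read(ss)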

But if you don’t want to remember all that, there’s rio.

The magic of rio
“The aim of rio is to make data file I/O [input/output] in R as easy as possible by implementing three simple functions in Swiss-army knife style,” according to the project’s GitHub page. Those functions are import(), export(), and convert().
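
As a rough illustration (the file names here are hypothetical), rio picks the reader or writer based on the file extension, so the same calls work across formats:

library(rio)

# import() infers the format from the extension (.csv, .xlsx, .sav, .dta, ...)
mydata <- import("sales.csv")

# export() does the same when writing
export(mydata, "sales.xlsx")

# convert() translates a file from one format to another in a single step
convert("sales.csv", "sales.json")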

Source: InfoWorld
3
Data Mining and Big Data / Regulating Facebook won’t prevent data breaches
« Last post by s.arman on Yesterday at 08:48:36 PM »
After revelations that political consulting firm Cambridge Analytica allegedly appropriated Facebook user data to advise Donald Trump’s 2016 U.S. presidential campaign, many are calling for greater regulation of social media networks, saying a “massive data breach” has occurred.

The idea that governments can regulate their way into protecting citizen privacy is appealing, but I believe it misses the mark.

What happened with Cambridge Analytica wasn’t a breach or a leak. It was a wild violation of academic research ethics. The story is still developing, but a college researcher has now acknowledged that he harvested Facebook users’ data and gave it to another company.

A scholar and his company failed to protect sensitive research data. A university did not do enough to stop him. Regulating Facebook won’t solve these problems.

What Kogan did wrong
I am a professor of media and information policy at the Quello Center at Michigan State University, and I was one of the first academics to study the internet. The quality and integrity of digital research is of great concern to me.

I think the Cambridge Analytica-Facebook incident is a total disaster. I just don’t think it’s a government regulatory failure.

Here’s the story, at least what the media has confirmed so far.

Aleksandr Kogan is a Cambridge University data scientist and psychology department lecturer. Outside of the university, Kogan also collected and analyzed Facebook user data – presumably with the knowledge of Facebook – for his company Global Science Research.

Through online surveys, he was reportedly able to gather sensitive personal information on tens of millions of American Facebook users, including demographic data, private messages, information about their friends and possibly even information about the friends of their friends.

Kogan then provided this data to a political consulting firm, Cambridge Analytica. According to the New York Times, the company analyzed that information, aiming to help shape the 2016 Trump campaign’s messages and identify potential Trump voters.

That was never his intent, Kogan said in a March 21 BBC radio interview. He reports being “stunned” that his “perfectly legal” research on the happiness and well-being of Facebook users was deployed as a political tool.

What Facebook did wrong
So did Facebook do something wrong, then? In my opinion, not really.

Facebook already has strict guidelines outlining what can and can’t be done with user data, which the researcher appears to have violated by passing the personal data he collected to Cambridge Analytica.

When Facebook launched in 2004, it quickly became a goldmine for social researchers. Suddenly, studies that previously relied only on survey data to gather information about individuals could directly observe how people connected to one another, what they liked, and what bound groups together.

In the early years, the company took an open and experimental attitude toward this kind of data mining, even teaming up with researchers to study how tweaking certain features of individuals’ Facebook pages affected voter turnout, say, or their moods.

Those studies, conducted without the informed consent of their participants – Facebook users – were widely criticized by social science researchers. In 2014, Facebook strengthened its existing guidelines on how user data can be gathered, analyzed and used.

Today, the company requires an extensive internal review of every request to extract personal data from users for research purposes.

In other words, Facebook self-regulated.

It may have been lax in enforcing its guidelines, though. The company says that once it learned that Cambridge Analytica had used Kogan’s data set for unauthorized purposes, it insisted that the data be deleted.

According to current press reports, Cambridge Analytica did not comply. For a while, it seems, Facebook did nothing to punish the company.

I believe the fallout from this scandal – including a Federal Trade Commission investigation – will push Facebook to take enforcement much more seriously.

After all, as CEO Mark Zuckerberg said in a March 21 Facebook post, the company “made mistakes” and it “has a responsibility to protect” its users.

Cambridge Analytica’s Facebook account has now been suspended. And under both U.S. and U.K. law, individuals or firms accused of unauthorized disclosure of personal information can face prosecution.


[Photo caption: Cambridge Analytica CEO Alexander Nix has been suspended over the Facebook scandal. Henry Nicholls/Reuters]
What academia does wrong
For me, what the Cambridge Analytica fiasco exposes is that university ethical review processes are not yet equipped for the digital age.

University researchers are bound by strict ethical guidelines. Across the world – particularly in the U.K., with its strong social research traditions – academics who want to study the attitudes or behavior of private individuals must first pass a stringent review process. They must also obtain explicit, informed consent from those who participate in their research.

It is impossible for me to imagine that an ethics board at the University of Cambridge would have ever approved of Kogan sharing his data with Cambridge Analytica.

Universities around the globe actually encourage faculty to develop entrepreneurial companies, as Kogan did. That helps their research reach beyond campus to foster innovation in business, industry and government.

But the norms and rules that protect participants in academic research – such as not sharing identifiable personal data – do not stop at the door of the university.

Kogan’s exploits show that professors’ outside jobs may raise conflicts of interest and may have escaped the reach of institutional review. This is an area of academic work-for-hire that universities need to review with an eye toward updating how they enforce research ethics.

I’ve briefed institutional review boards at a number of universities, and I can attest that members often don’t understand how the internet and social media networks have transformed the way data is created, gathered, analyzed and shared.

Frequently, the authorities who grant professors and students permission to conduct their studies are anchored in the standards of medical research, not modern social science.

Many schools also fail to appreciate how cutting-edge some academic fields have become. Big data and computational analytics are among the most innovative scientific areas today.

Legitimate, company-approved access to social media user data allows researchers to study some of the most urgent issues of the 21st century, including fake news, political echo chambers and technological trends. So it is not surprising that political campaigns would want to appropriate these research practices.

Until they come up with new rules, I fear universities’ lack of digital savvy will remain a threat to online privacy.


Source: theconversation.com
4
Software Project Management / Re: 10 Icebergs That Will Sink Your Project
« Last post by s.arman on Yesterday at 08:46:36 PM »
good one
5
Thanks for sharing
6
Software Quality Assurance and Testing / Re: Test Automation Framework
« Last post by s.arman on Yesterday at 08:46:05 PM »
need help to know more
7
Software Quality Assurance and Testing / Re: Software Quality
« Last post by s.arman on Yesterday at 08:45:28 PM »
Not the same at all.
8
Cloud Computing / Re: cloud computing new research areas
« Last post by s.arman on Yesterday at 08:45:00 PM »
Thanks for sharing
9
very informative
10
Useful Information.