Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - farzanaSadia

31
Java Forum / What’s new in Java EE 8
« on: October 11, 2017, 08:13:01 PM »
Although Oracle has been mostly quiet lately about the progress of its enterprise Java overhaul, that is likely to change soon with the impending arrival of Java Platform, Enterprise Edition 8, better known as Java EE 8.

The upgrade retools enterprise Java for cloud and microservices environments. A vote on the Java Community Process specification for Java EE 8 is under way and is due to be completed on August 21. Java EE 8, the official specification states, is about simplification while extending the range of the platform to accommodate emerging technologies in the cloud and web. The specification also emphasizes HTML5 and HTTP/2 support.

Java EE 8 will support a multitude of Java technology specifications, including:

    JSON-B (JavaScript Object Notation Binding), providing a binding layer for converting Java objects to and from JSON messages (see the sketch after this list).
    Updates to JSON-P (JSON Processing API), improving the object model.
    JAX-RS (Java API for RESTful Web Services) 2.1 reactive client API.
    JAX-RS support for server-sent events, offering a one-way channel from a server to a client.
    HTTP/2 support in Servlet. Java Servlet provides a programming class to extend server capabilities.
    Java EE Security API, accommodating cloud and PaaS paradigms.
    Bean Validation 2.0, leveraging Java 8 language constructs for use in validation. Bean Validation enables expression of constraints on object models using annotations.
    JavaServer Faces 2.3, for building server-side user interfaces.
    CDI (Contexts and Dependency Injection) 2.0, emphasizing asynchronous events.
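
As a concrete illustration of the JSON-B item above, here is a minimal sketch of the binding API as specified for Java EE 8; the Person class and its field names are purely illustrative.

    import javax.json.bind.Jsonb;
    import javax.json.bind.JsonbBuilder;

    public class JsonbDemo {

        // Illustrative data class; JSON-B maps public fields and
        // getters/setters by default.
        public static class Person {
            public String name;
            public int age;
            public Person() {}
            public Person(String name, int age) { this.name = name; this.age = age; }
        }

        public static void main(String[] args) {
            Jsonb jsonb = JsonbBuilder.create();

            // Java object -> JSON text
            String json = jsonb.toJson(new Person("Ada", 36));

            // JSON text -> Java object
            Person person = jsonb.fromJson(json, Person.class);
            System.out.println(json + " -> " + person.name);
        }
    }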

Java EE upgrades to come faster

Java EE 8 will be followed next year by Java EE 9, as part of a two-phase effort to retool the platform for modern-day cloud and microservices deployments. Java EE 8 is centered on accommodations to configure services and on health-checking to manage services. The follow-up EE 9 release is slated to promote deployment of smaller units of services and a reactive programming model for building large-scale, event-based systems.

Built on top of Java SE (Standard Edition), Java EE offers an API and runtime environment for building and running large-scale, multitiered network applications, with security and reliability serving as key goals of the platform. The last major release, Java EE 7, became available in June 2013 and focused on HTML5 and mobility.

As part of its Java EE 8 development process, Oracle has been working on GlassFish 5, the open source application server that has served as a reference implementation for the Java EE platform. The intent is to have two GlassFish 5 promotion builds weekly to catch integration issues sooner.

Java SE is also set for an upgrade, with version 9 due on September 21 after multiple delays.
Java EE rebellion results in MicroProfile support

Last year, prominent members of the enterprise Java community rose up to protest what was perceived as stalled progress on Java EE. Oracle then rolled out its plan to revitalize the platform, noting that the company had wanted to retreat from earlier Java EE plans it deemed inadequate for modern computing paradigms.

One of the rebel efforts led to the development of MicroProfile, providing a baseline platform definition for microservices. The Eclipse Foundation has since taken over MicroProfile, which will still be promoted as a mechanism to accelerate adoption of Java EE 8. The current 1.1 version of MicroProfile provides a stack that includes CDI, JSON, JAX-RS, and a configuration API.

32
Java Forum / What’s new in JUnit 5 for Java testing
« on: October 11, 2017, 08:12:08 PM »
The JUnit testing framework for Java has just moved to version 5. Unlike previous releases, JUnit 5 features modules from several subprojects, including:

    Platform, for launching testing frameworks on the JVM; it defines the TestEngine API for developing testing frameworks and provides a console launcher for running tests from the command line.
    Jupiter, the new programming and extension model for writing tests and extensions, along with plugins for building and running them with Gradle and Maven.
    Vintage, for running JUnit 3 and 4 tests on the JUnit 5 platform.

In Jupiter, a developer can use annotations as meta-annotations, defining a custom annotation that automatically inherits the semantics of the meta-annotations it carries—a new programming model in JUnit. Jupiter also lets test constructors and methods have parameters, allowing for more flexibility and enabling dependency injection for constructors and methods.
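
As a rough sketch of both ideas (assuming only JUnit Jupiter on the test classpath; the @FastTest annotation and the test class are hypothetical names), a Jupiter test might look like this:

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;

    import org.junit.jupiter.api.Tag;
    import org.junit.jupiter.api.Test;
    import org.junit.jupiter.api.TestInfo;

    // A composed annotation: @FastTest automatically inherits the semantics
    // of its meta-annotations @Test and @Tag("fast").
    @Target({ ElementType.TYPE, ElementType.METHOD })
    @Retention(RetentionPolicy.RUNTIME)
    @Test
    @Tag("fast")
    @interface FastTest {}

    class MetaAnnotationDemo {

        // Jupiter resolves the TestInfo parameter automatically, so test
        // methods (and constructors) may now declare parameters.
        @FastTest
        void runsAsAFastTest(TestInfo info) {
            System.out.println("Running " + info.getDisplayName());
        }
    }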

JUnit 5 requires Java 8 or higher at runtime, but developers can still test code compiled with previous versions of the Java Development Kit. JUnit 5 artifacts do not ship with compiled module descriptors for JDK 9, but there are accommodations for JDK 9. Tests can be run on the Java class path; in this regard, there are no changes between Java 8 and 9, according to documentation. Also, running JUnit Jupiter tests on the module path is implemented by pro, a Java 9-compatible build tool.

33
Java Forum / What's new in Kotlin 1.2? Code reuse, for starters
« on: October 11, 2017, 08:11:33 PM »
Version 1.2 of the statically typed Kotlin language will offer an experimental feature enabling reuse of code across platforms, as well as compatibility with the Java 9 module system. The beta of Kotlin 1.2 is now available for download.

Kotlin’s experimental multiplatform projects capability lets developers reuse code between supported target platforms: JVM and JavaScript initially, and later native. Code to be shared between platforms is placed in a common module; platform-dependent parts are put in platform-specific modules. During compilation, code is produced for both the common and platform-specific parts.

Developers can express dependencies of common code on platform-specific parts via expected and actual declarations. An expected declaration specifies an API, while an actual declaration is either a platform-specific implementation of that API or a type alias referring to an existing implementation of the API in an external library. The standard library, meanwhile, features the kotlin.math package for performing mathematical operations in cross-platform code.

The kotlin.math package also now offers better precision for math polyfills for JavaScript.

Kotlin 1.2’s standard library is compatible with the newly introduced Java 9 module system, which forbids split packages (multiple .jar files declaring classes in the same package). In Kotlin 1.2, the kotlin-stdlib-jdk7 and kotlin-stdlib-jdk8 artifacts replace the old kotlin-stdlib-jre7 and kotlin-stdlib-jre8.

To support Java 9, Kotlin 1.2 also removes the deprecated declarations in the kotlin.reflect package from the kotlin-reflect library. Developers need to switch to using the declarations in the kotlin.reflect.full package, which debuted in Kotlin 1.1.

Type inference improvements in Kotlin 1.2 include enabling the compiler to use information from type casts in type inference. If a developer calls a generic method that returns a type parameter, such as T, and casts the return value to a specific type, such as Foo, the compiler now understands that T for this call needs to be bound to the type Foo. This is especially important for Android developers, as it lets the Kotlin compiler correctly analyze findViewById calls against Android API Level 26. Also, the compiler now has an option to treat all warnings as errors.

Kotlin 1.2 also has these enhancements:

    It now supports array literals in annotations, simplifying coding.
    It uses a more consistent syntax.
    The new reflection API lets developers check whether a lateinit variable has been initialized.
    The lateinit modifier now can be used on top-level properties and local variables.

Kotlin had its origins as a language for the JVM but has since been expanded to compile to JavaScript as well. The language received a boost this spring when Google endorsed it as a mechanism for building Android mobile applications, alongside Java itself.

34
Anti Virus / Google Talk used to distribute Fake AV
« on: October 11, 2017, 08:08:29 PM »
When speaking in public and delivering presentations, I am often asked “Why would they want my Google/Yahoo!/MSN/Facebook credentials? It’s only a throw-away email address.”

These services have transformed from simple webmail and messaging experiences into fully integrated platforms for video, voice, instant messaging, photo sharing, and of course social networking. As Google learned from the launch of Google Buzz, not everyone wants everything tied together in one place with Mark Zuckerberg-like openness.

Maria Varmazis, a colleague from our Boston office, got to experience what happens when a friend’s account is compromised. When she logged onto Gmail, she got a pop-up message from someone she regularly chats with: “Hey are you on Facebook ??? If u are then check this out”. Wisely, Maria didn’t click on the link and instead passed it on to me to investigate.

The link led me to a web page that had some dancing stick people and a link that read, “Click on the picture to download my party pictures gallery. . . (Click Open or Run when prompted.)”.

Of course I wanted to view this party picture gallery. . . Past experience tells me the best pictures are taken after 11pm at parties. When I clicked the image, Internet Explorer presented a download prompt for a file called my_image_gallery.scr.

When I tried to run the file, Sophos Anti-Virus notified me that it detected a virus, Mal/FakeAV-BT, and that it quarantined the file.

You’ll notice the size of the file was only 25K. This file, like many other fake AV programs, is simply a downloader that later retrieves its payload of malware. This allows the controllers of the botnet to decide which malware to place at the destination web page, and gives you another chance to prevent the attack by using web filtering.

Had SophosLabs not already published an identity for this FakeAV, our integrated HIPS (Host Intrusion Prevention System) technology would have prevented infection as well.

HIPS detected the file’s behavior as HIPS/ProcInj-003, indicative of malware trying to inject itself into the Internet Explorer browser.

Another thing I noticed was that all of the files were in areas that did not require administrative privilege. This is a technique in greater use since Microsoft’s addition of User Account Control to Vista and later versions of Windows. This was one of the main reasons I got the results I did when testing Windows 7 against the latest 10 threats.


This attack once again shows us the importance of defense in depth. An administrator for an organizational network has several chances to prevent this infection:

    Education. Teach end users how to spot something out of the ordinary, to avoid clicking links in IMs, and what techniques are used in social engineering.
    Anti-virus. As Virus Bulletin regularly demonstrates, the majority of up-to-date anti-virus products protect against most in-the-wild threats.
    Proactive protection. Using heuristic, behavioral and other techniques provides protection against malicious code that may not yet be detected by your anti-virus definitions.
    Web filtering. Both the site offering malware for me to download, and the one that was luring me into clicking the picture were blocked by the Sophos Web Appliance as malicious. Our web appliance also scans all your downloads for malware, and lets you disable downloading of dangerous filetypes.

Unfortunately, quite often our friends may not really be our friends. Use this as a reminder to stay vigilant and warn others about this type of attack.

35
Anti Virus / Malicious Web Ad Infecting Android Phones
« on: October 11, 2017, 08:07:00 PM »
Savvy Internet users know not to click on strange links, but malvertising — malicious code hidden within otherwise innocuous advertisements — presents a more pernicious problem.

A new malvertising campaign isn’t content to just redirect your web browser to unsafe sites. If you're using an Android phone, it downloads and installs an Android app that can compromise your entire phone, with no known panacea. The trap is easy to avoid, but once it’s sprung, it’s sprung for good.


This information comes from the Zscaler ThreatLabZ team, a San Jose, California-based security firm. Zscaler discovered the issue by scouring the Godlike Productions forums, a hotbed of UFO and conspiracy theory activity. For once, the tinfoil-hatted commenters had it right; someone really WAS out to get them, and that someone was a cybercriminal.
What You Need to Do

The good news is that avoiding the problem is extremely simple, and you may not even be susceptible to it in the first place. In order for apps from sources other than the Google Play store to be installed, users must go into Settings-->Security and allow apps from "Unknown Sources." That function is a security risk, and is disabled by default.

Still, if you use third-party app stores (like the Amazon Appstore), you've already enabled Unknown Sources. To disable the feature, check your phone’s settings. Enabling and disabling third-party app installation will be under the Security menu, although that menu's location may vary depending on your phone.


Advertisements on the forum automatically installed an Android APK known as "kskas.apk" to users' phones. The program calls itself "Ks Clean" and promises to clean out the Android device. Once installed, though, it claims that the phone is vulnerable to a security loophole and requires an update to safeguard the device.

The update, of course, is in reality another app, and a much more malicious one. This one requires administrative privileges to install, which means that the "update" app can control your phone at the deepest level.

Once installed, the update app takes no interest in either cleaning your system or plugging security gaps. Instead, it plasters your home screen with obnoxious advertisements. While it doesn’t seem to be anything more malicious than that at the moment, it does communicate to its masters using a fairly complex command-and-control server, and could distribute actual malware if its creator so desired.

Uninstalling the app is impossible, since "update" controls the device at an administrative level. Any attempt to get rid of it forces the phone into a lock screen, and at the time of writing, there's no way around it. Your only recourse is to perform a factory reset on the phone. Depending on how much data you have saved on your device, this could range from inconvenient to disastrous.

If you have to keep installing third-party apps, you can still avoid this particular menace by just denying Ks Clean or its "update" permissions when they try to install. A good Android antivirus program should also catch the app and quarantine it before it has a chance to do any damage.

As for Godlike Productions, Zscaler was unable to find the particular ads that triggered the malicious APK, so they could be gone by now. The truth, as the site’s adherents might say, is out there.

36
Java Forum / The most popular IDEs? Visual Studio and Eclipse
« on: October 11, 2017, 08:03:12 PM »
Microsoft’s Visual Studio leads the way in desktop IDE (integrated development environment) popularity, with Eclipse close behind, according to PYPL’s August index of IDE popularity. Android Studio was a distant third.

Visual Studio takes a 22.4 percent share in this month’s index. Eclipse follows with a 20.38 percent share. Much further back was Android Studio, with a 9.87 percent share. “It’s surprising how a couple of IDEs have about half the popularity,” PYPL’s Pierre Carbonnelle said.

The index is based on an analysis of how often IDEs are searched on in Google, similar to PYPL’s monthly language popularity index. The more searches for an IDE, the more popular it is presumed to be. The 10 most popular IDEs for August:

    Visual Studio, 22.4 percent
    Eclipse, 20.38
    Android Studio, 9.87
    Vim, 8.02
    NetBeans, 4.75
    JetBrains IntelliJ, 4.69
    Apple Xcode, 4.35
    Komodo, 4.33
    Sublime Text, 3.94
    Xamarin, 3.46

In 11th place was Microsoft’s open source, cross-platform development environment, Visual Studio Code, with a 2.86 percent share. Visual Studio Code reached a 1.0 release only 16 months ago.

PYPL also looked at the popularity of online development environments, using the same ranking criteria as the desktop variety. The top two lead the field by a huge margin. Cloud9 took the top spot with a 35.77 percent share, closely followed by JSFiddle with 31.42 percent. The top 10:

    Cloud9, 35.77 percent
    JSFiddle, 31.42
    Koding, 9.05
    Ideone, 5.93
    Codio, 5.92
    Codeanywhere, 4.99
    Pythonanywhere, 2.53
    Codenvy, 1.67
    Codiad, 0.58
    Python Fiddle, 0.43

37
Although Java, C, and C++ have seen drops in language popularity, they once again remain atop the Tiobe language popularity index, which uses the number of developers, courses, and vendors for each language to calculate its popularity. Their two main contenders—Python and C#—face obstacles that may keep them in the second tier.

Python actually slipped 1.32 points from its rating a year ago, while C# slipped 0.71 points in the same period.

Python and C# have long been poised to become the next big programming languages, but that hasn’t happened so far because of their limitations, notes the Tiobe report’s authors: “C# is not a Top 3 language because its adoption in the non-Windows world is still low. Python on the other hand is dynamically typed, which is a blocker for most large and/or critical software systems to use it.”

“It will be hard for [C# and Python] to become part of the Top 3,” said Paul Jansen, managing director at software quality services vendor Tiobe. “Although I think the days are over for the big three in the long term, it is unclear to me who will replace them. Just name a candidate and I can tell you why it will not reach the top three.”

But one prediction Jansen does make is that Google’s Go language will rise in the rankings. Go hit the No. 10 spot in Tiobe’s July ranking, then fell to No. 16 in the August ranking and then to No. 17 in the September ranking. But Jansen sees this downward trend for Go as a temporary glitch. “I am pretty sure it will be higher again next month.”

One reason for the difficulty in predicting language popularity in the future is that fewer and fewer applications are being written in a single language. That means more languages have opportunities to grow their usage, and the current leaders will increasingly cede market share to other languages.

The top 10 languages for the September Tiobe index were:

    Java: 12.667 percent
    C: 7.382 percent
    C++: 5.565 percent
    C#: 4.779 percent
    Python: 2.983 percent
    PHP: 2.21 percent
    JavaScript: 2.017 percent
    Visual Basic .Net: 1.982 percent
    Perl: 1.952 percent
    Ruby: 1.933 percent

In the Pypl Popularity of Programming Language index, which looks at how often language tutorials are searched on in Google, the top 10 for this month were:

    Java: 22.4 percent
    Python: 17.0 percent
    PHP: 8.7 percent
    C#: 8.1 percent
    JavaScript: 8.0 percent
    C++: 6.8 percent
    C: 6.1 percent
    R: 3.7 percent
    Objective-C: 3.5 percent
    Swift: 2.9 percent

38
Java Forum / Java debugging comes to Visual Studio Code
« on: October 11, 2017, 07:59:57 PM »
Microsoft has released a Java debugger for its free open source editor, Visual Studio Code. The newly minted extension is intended to work as a companion to the Language Support for Java extension provided by Red Hat. 

Whereas Red Hat’s Language Support for Java extension provides IntelliSense capabilities and Java project support, it does not include debugging capabilities. Microsoft’s Java Debug Extension works alongside the Red Hat extension to provide them. Still in preview, the Java Debug Extension offers capabilities including launch/attach, breakpoints, control flow, data inspection, and a debug console. The Microsoft and Red Hat extensions are available separately or in the Java Extension Pack, which bundles both together in a single install. Microsoft’s plans call for enabling a modern workflow for Java, with more features and extensions planned going forward.

Visual Studio Code is designed to be a streamlined editor with support for operations such as task running, debugging, and version control. It leaves more complex workflows to fuller-featured IDEs. Leveraging the GitHub Electron framework for building cross-platform desktop applications, Visual Studio Code runs on Windows, MacOS, and Linux.

This story, "Java debugging comes to Visual Studio Code" was originally published by InfoWorld.

39
Java Forum / Java microservices profile gets fault-tolerance capabilities
« on: October 11, 2017, 07:59:18 PM »
The Eclipse Foundation’s MicroProfile project to add microservices to enterprise Java has released MicroProfile 1.2, which adds capabilities for fault tolerance and security.
New features in MicroProfile 1.2

A fault-tolerance API in MicroProfile 1.2 provides a way for applications to deal with the unavailability of a microservice, said IBM Distinguished Engineer Ian Robinson, who has worked on MicroProfile. When old-style monolithic applications fail, they bring down the entire application. But applications composed of microservices continue to operate if a specific microservice fails, leading to “more interesting failure scenarios,” he said. To deal with service failures, applications need a way of handling the unavailability of a service, such as resorting to a fallback service if a primary service is unavailable. Such fallbacks are what MicroProfile 1.2 allows.
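
A minimal sketch of such a fallback using the MicroProfile Fault Tolerance annotations; the QuoteService bean, its methods, and the retry and timeout values are hypothetical choices, not taken from the article.

    import javax.enterprise.context.ApplicationScoped;

    import org.eclipse.microprofile.faulttolerance.Fallback;
    import org.eclipse.microprofile.faulttolerance.Retry;
    import org.eclipse.microprofile.faulttolerance.Timeout;

    @ApplicationScoped
    public class QuoteService {

        // If the primary call keeps failing or exceeds the timeout,
        // the runtime invokes the named fallback method instead.
        @Retry(maxRetries = 2)
        @Timeout(500)
        @Fallback(fallbackMethod = "cachedQuote")
        public String latestQuote(String symbol) {
            return callRemoteQuoteService(symbol);
        }

        // The fallback has the same signature as the guarded method.
        public String cachedQuote(String symbol) {
            return "cached quote for " + symbol;
        }

        private String callRemoteQuoteService(String symbol) {
            // Stand-in for a call to a remote microservice that is down.
            throw new RuntimeException("quote service unavailable");
        }
    }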

MicroProfile 1.2 also adds interoperability with JSON Web Tokens (JWT). With JWT, you can provide a security token in a standard format so that it propagates from one microservice to another.
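
A minimal sketch of how a JAX-RS resource might consume a propagated token through the MicroProfile JWT API; the resource path, role name, and class are hypothetical.

    import javax.annotation.security.RolesAllowed;
    import javax.enterprise.context.RequestScoped;
    import javax.inject.Inject;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;

    import org.eclipse.microprofile.jwt.JsonWebToken;

    @Path("/orders")
    @RequestScoped   // required so the per-request token can be injected
    public class OrdersResource {

        // The verified JWT that arrived with the request; its groups
        // claim backs the @RolesAllowed check below.
        @Inject
        private JsonWebToken jwt;

        @GET
        @RolesAllowed("orders-reader")
        public String list() {
            return "orders for " + jwt.getName();
        }
    }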

An externalized configuration capability in MicroProfile 1.2 provides a standard way to import an application configuration from outside a Docker container.
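
A minimal sketch using the MicroProfile Config API; the property name, default value, and bean are illustrative.

    import javax.enterprise.context.ApplicationScoped;
    import javax.inject.Inject;

    import org.eclipse.microprofile.config.inject.ConfigProperty;

    @ApplicationScoped
    public class PaymentGateway {

        // Resolved at runtime from system properties, environment variables,
        // or a microprofile-config.properties file -- that is, from outside
        // the container image, so the same image runs in every environment.
        @Inject
        @ConfigProperty(name = "payment.endpoint", defaultValue = "http://localhost:8080/pay")
        private String endpoint;

        public String endpoint() {
            return endpoint;
        }
    }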

MicroProfile 1.2 also features:

    Common annotations.
    Metrics.
    Health checking.

Where to download MicroProfile 1.2

The source code for MicroProfile 1.2 is available for download from an Eclipse webpage.

This story, "Java microservices profile gets fault-tolerance capabilities" was originally published by InfoWorld.

40


A “category one” cyber-attack, the most serious tier possible, will happen “sometime in the next few years”, a director of the National Cyber Security Centre has warned.

According to the agency, which reports to GCHQ and has responsibility for ensuring the UK’s information security, a category one cybersecurity incident requires a national government response.

In the year since the agency was founded, it has dealt with 500 incidents, according to Ian Levy, the technical director: 470 category three incidents and 30 category two, including the WannaCry ransomworm that took down IT in multiple NHS trusts and bodies.

But speaking at an event about the next decade of information security, Levy warned that “sometime in the next few years we’re going to have our first category one cyber-incident”. The only way to prevent such a breach, he said, was to change the way businesses and governments think about cybersecurity.

Rather than obsessing about buying the right security products, Levy argued, organisations should instead focus on managing risk: understanding the data they hold, the value it has, and how much damage it could do if it was lost, for instance.

His words at the Symantec event come against the background of a major breach at the US data broker Equifax, which lost more than 130 million Americans’ personal information in a hacking attack in May. The data stolen is extremely sensitive, including names, addresses, social security numbers and dates of birth – all the information needed to steal someone’s identity online.

A further 400,000 British residents were affected by the hack, as well as a number of Canadian residents. The information stolen about them was much less personal in nature, however, consisting only of names, dates of birth, email addresses and telephone numbers.

Striking a dour note, Levy warned that it may take the inevitable category one attack to prompt such changes, since only an attack of that scale would result in an independent investigation or government inquiry.

“Then what will really come out is that it was entirely preventable… It will turn out that the organisation that has been breached didn’t really understand what data they had, what value it had or the impact it could have outside that organisation.”

Levy’s advice to organisations who want to prevent such a catastrophic breach from affecting them is to stop putting their faith in off-the-shelf security solutions, and instead work with employees to uncover what is actually possible.

“Cybersecurity professionals have spent the last 25 years saying people are the weakest link. That’s stupid!” he said. “They cannot possibly be the weakest link – they are the people that create the value at these organisations.

“What that tells me is that the systems we’ve built, as technical systems, are not built for people. Techies build systems for techies, they don’t build technical systems for normal people.”

41


Equifax has admitted that almost 700,000 UK consumers have had their personal details accessed following a cyber-attack, a figure far higher than previously thought.

As well as affecting more Britons, the hack also resulted in significantly more damaging data being leaked on those who were affected. The information lost by the US credit monitoring firm included partial credit card details, phone numbers and driving licence numbers.

The Information Commissioner’s Office said that it was still investigating the company, which had initially claimed just 400,000 British residents had been affected.

“We continue to investigate what happened at Equifax and how UK citizens’ information came to be compromised,” an ICO spokesperson confirmed. “It is a complex and fast-moving case and we are working closely with other UK regulators and our counterparts in Canada and the US.

“We have been pressing Equifax to confirm the scale and any impact on UK citizens and, from the outset, we advised the firm to alert and support victims.”

Equifax – based in Atlanta, Georgia – discovered the hack in July but only informed consumers last month, leading the information commissioner to order the company to inform British residents “at the earliest opportunity” if their personal information had been put at risk.

The move came after Equifax said a hack had exposed the social security numbers and other data of about 143 million Americans.

Lenders rely on the information collected by credit bureaux such as Equifax to help them decide whether to approve financing for homes, cars and credit cards.

Equifax said a file containing 15.2m UK records, dated between 2011 and 2016, was hacked and included data from “actual” consumers, as well as test and duplicate data.

The company said its investigation found that it would need to contact 693,665 British consumers by post to tell them how to protect against any potential risk.

Almost 13,000 consumers had an email address associated with their Equifax.co.uk account accessed in 2014, while just under 15,000 consumers had portions of their Equifax membership details – such as username, password, secret questions and answers and partial credit card details – accessed.

It said nearly 30,000 had their driving licence number accessed, while the phone numbers of a further 637,430 consumers were accessed.

Patricio Remon, Equifax’s Europe chief, again apologised to anyone affected by the hacking.

“It has been regrettable that we have not been able to contact consumers who may have been impacted until now, but it would not have been appropriate for us to do so until the full facts of this complex attack were known, and the full forensics investigation was completed,” he said.

Anyone who is sent a letter by Equifax should take advantage of the help offered to guard against potential risks.

Cyber-attacks have become an increasing problem for big firms that hold large amounts of customer data.

HSBC and TalkTalk are among the most high-profile British firms to be hit in recent years.

42
Software Engineering / Industry Watch: The liquification of software
« on: October 11, 2017, 07:54:41 PM »
The days of software packages are coming to an end. Say hello to what JFrog co-founder and chief architect Fred Simon calls “liquid software.”

“Once the number of applications and libraries and pieces of the software that needed to be managed reached a certain point, we started to see an exponential increase in the amount of software modules, and the frequency of updates and versions, all the way to the end user,” he began.

“What we used to consider as software packages to manage with tagging and versioning, and a destination, address number, type, barcode and then you ship it away in any kind of format – all these concepts of actually creating a package and delivering software in the form of a package, little by little has disappeared due to the fact that we are making more and more of those and releasing them more and more frequently.

“We shifted our approach to software updates, not out of packages, but out of the concept of continuous flow of software. You start to think in terms of piping, and then you start to connect the different software factories and the different departments and the different vendors and the different teams by connecting them with pipes, not by connecting them by physically delivering or on the cloud delivering the files from one place to another, but continuously providing the latest version of whatever software is available to the next destination.”

This is what Simon says (couldn’t help it!) is the liquification of software systems. “Little by little, we are seeing any kind of software in any kind of environment moving to this liquid delivery mechanism, where you plug yourself to a client that you trust to deliver clear water which is unpolluted and secure, and by the way you’ll get all security updates and the latest versions of whatever you want,” he explained.

If this sounds like the DevOps revolution, it’s because Simon said it is. “There is another catch phrase we use quite a bit to reflect this; it’s release fast or die. The ones that are not even trying to do that are probably companies we won’t see in the next decade.”

At JFrog, Simon said they want to make sure the tools they are creating can be used by the “plumber,” who creates the piping and lets the liquid software flow. “The replication and the pipes that are created between the different repositories, which can be located all over the world, need to continuously deliver the right things to the right destination,” he said. “All the synchronization is a critical piece of our tooling. So of course the ability to see and to transparently view the actual flow of the software. Before, when it was actual trucks, the way to control it was to control the timing of the delivery of the package. When you go to liquid software, you need visibility and transparency, but need to change the control mechanism for frequency, quality and flow of delivery.”

Liquification is a strong force in the market, but for organizations with existing processes, the move to a continuous flow of software has many challenges. “To be frank, the full liquification of software is contradicting a lot of the processes many companies set in place. There are a lot of companies who have a six-month block time before the vendor has a new version and the new version gets inside the company. It’s not rare to have such a strong mechanism of compliance and any kind of test that companies and processes set in place, where they only accept a very few releases per year. Those processes are the ones that are suffering today.

“When you have a monolith, you start with a version in your version control system and you build everything and test everything and deliver everything,” he continued. “It’s a very sequential process that for really big software could actually take weeks. Once you start in microservices, each of the microservices has its own lifecycle, so you can make your own single build and test locally and have a new version automatically created. The ability to aggregate all those microservices and to tag a specific version at a specific time and create another application out of those microservices and those different versions rapidly and efficiently is critical for the next step.”

43
You’ve probably heard the conventional wisdom that teams should be aligned around a common goal. But while it’s important to have a shared business objective, the idea that everyone needs to have the same goal is selling your engineering team short.

Productive tension is beneficial for organizations. In my engineering group, I have at least five teams with five different goals:

    Product managers want to ship features as fast as possible
    QA wants to ship features with zero bugs
    Developers want to ship features with no technical debt
    Operations never wants to ship new features
    User Experience wants to ship features that users understand intuitively

These teams should have different goals. Each brings a unique perspective and different aspects of a project to the table, and while all of their goals are important, they are not singular. If one team is winning out over the others, you quickly learn that a single goal never creates the best product for customers. Having different goals drives teams to have critical conversations. And the necessities of business force a compromise that creates the best outcome.

People sometimes ask me if I favor a product-led or developer-led culture. When you’re running a business, you always need to be customer-led. Product managers can’t run the show on their own. Neither can developers. If one team wins out, the goal of creating the best product falls by the wayside.

For instance, if product dominates, you ship features more quickly. But you accrue more technical debt, so developers can’t achieve their goal. You also ship more bugs, so QA can’t succeed. And your UX team doesn’t have time to test new designs with end users, so they miss their goal, too.

On the flip side, if developers are in charge, any semblance of a timeline disappears, as developers work at their own pace. Product managers get derailed because they have no idea when things will ship. QA gets too much to test at once, making it hard to eliminate bugs. UX still has to fight for the time to iterate and test. And Ops isn’t ready for the onslaught when features are finally released.

The ultimate goal of satisfying customers causes different teams to dominate at different times. We’ve held a product for months if our beta users found it confusing, and spent more time in UX and development to get it right. We’ve also shipped features early, with pieces missing, to gain a quick hit in customer recognition, and then added to the product over time.

To mitigate these extremes, all teams need to have a voice and be heard. They also need to be sympathetic to each other’s goals and able to compromise. There needs to be understanding and tension.

A healthy tension leads to creative solutions. If you’ve organized your teams in the right way, taking into account personalities and opinions, each faction will be advocating for something unique rather than toeing the line of the dominant party.   

One solution our team has come up with to balance these tensions is an internal beta program. The rest of the company acts as the voice of the customer and helps us figure out when we’re ready to ship. This allows us to ship as early as possible and still please our customers. By not setting arbitrary ship dates, we can focus on speed and quality and leave our customers out of it until the product is ready.

Of course, internal betas aren’t the only solution. If your teams keep pushing to do their best work, advocating from their particular perspectives, and being sympathetic to each other’s goals, they’ll be well-positioned to get the best product out in a reasonable time.

44
Software Engineering / Amazon releases new compiler for AI frameworks
« on: October 11, 2017, 07:38:12 PM »
Amazon is addressing artificial intelligence development challenges with a new end-to-end compiler solution. The NNVM compiler, developed by AWS and a team of researchers from the University of Washington’s Allen School of Computer Science & Engineering, is designed for deploying deep learning frameworks across a number of platforms and devices.

“You can choose among multiple artificial intelligence (AI) frameworks to develop AI algorithms. You also have a choice of a wide range of hardware to train and deploy AI models. The diversity of frameworks and hardware is crucial to maintaining the health of the AI ecosystem. This diversity, however, also introduces several challenges to AI developers,” Mu Li, a principal scientist for AWS AI, wrote in a post.

According to Amazon, there are three main challenges AI developers come across today: switching between AI frameworks, maintaining multiple backends, and supporting multiple AI frameworks. The NNVM compiler addresses these challenges by compiling front-end workloads directly into optimized machine code for hardware back-ends. “Today, AWS is excited to announce, together with the research team from UW, an end-to-end compiler based on the TVM stack that compiles workloads directly from various deep learning frontends into optimized machine codes,” Li wrote. The TVM stack, also developed by the team, is an intermediate representation stack designed to close the gap between deep learning frameworks and hardware backends.

“While deep learning is becoming indispensable for a range of platforms — from mobile phones and datacenter GPUs, to the Internet of Things and specialized accelerators — considerable engineering challenges remain in the deployment of those frameworks,” said Allen School Ph.D. student Tianqi Chen. “Our TVM framework made it possible for developers to quickly and easily deploy deep learning on a range of systems. With NNVM, we offer a solution that works across all frameworks, including MXNet and model exchange formats such as ONNX and CoreML, with significant performance improvements.”

The NNVM compiler is made up of two components from the TVM stack: NNVM for computation graphs and TVM for tensor operators, according to Amazon.

“NNVM provides a specification of the computation graph and operator with graph optimization routines, and operators are implemented and optimized for target hardware by using TVM. We demonstrated that with minimal effort this compiler can match and even outperform state-of-the-art performance on two radically different hardware: ARM CPU and Nvidia GPUs,” Li wrote. “We hope the NNVM compiler can greatly simplify the design of new AI frontend frameworks and backend hardware, and help provide consistent results across various frontends and backends to users.”

45
Software Industry in Bangladesh / Job opportunity at Samsung
« on: August 03, 2017, 02:02:27 PM »
There are some paid, full-time internship opportunities in a renowned company.
Fresh graduates or last-semester students can apply.
Qualifications:
-Candidates must be able to commit for six months
-Interest in software testing
-Basic knowledge of shell scripting
Please contact Ssh Shamma by email today or send her your CV.
syeda.swe@diu.edu.bd
