If you’re an employee at a huge corporation, you ought to be horrified by the vision of a connected-device future that Microsoft revealed at its Build developer conference in Seattle.
Two tech demos from the keynote presentations caught people’s attention, both for being entertaining and for revealing a potentially frightening future for anyone working for a big company with the will to micro-monitor its staff.
The first featured cameras viewing workers on a construction site. The cameras are connected to the cloud, where artificial intelligence monitors everything in real time, identifying individual employees and recognizing nearly every piece of equipment on the site.
That is undoubtedly cool, particularly as the AI can instantly notice when somebody who shouldn’t be there is on the site, or flag when someone is using dangerous equipment in an ill-advised way.
Microsoft’s demo deliberately focused on a construction site, where accidents are all too common, and where a smart AI overseer sort of makes sense. Rapidly identifying OSHA violations or intruders, then pushing those details to an employer via mobile alerts, could genuinely save limbs and lives.
It’s questionable, though, just how appropriate these tools are in other workplaces. Not places where security or safety is a primary concern, but places where employers obsessively monitor staff in some misguided attempt to maximise profit by chewing up and spitting out the fleshy cogs in their machine.
With a surveillance system like this, you couldn’t invite a good friend to drop by for lunch, because your boss would know, a notification quickly appearing on their phone. There’d be no long lunches or grabbing extra office supplies from the closet. Take too many smoke breaks, or have a bout of indigestion that leaves you on the toilet longer than usual? The AI would notice so quickly that your boss could meet you in the hallway with a bottle of antacids.
The little autonomy many staff still have in the workplace would be eliminated if the system the keynote speaker demonstrated were moved from construction sites into more traditional offices.
The other workplace demo focused on Cortana, and how it might soon be everywhere, not just on your laptop or phone. The demo showed a woman chatting with a Cortana-powered Invoke speaker on a set designed to look like her home. Cortana reminded her she had a meeting, so she hopped in the car, where it promptly warned her that traffic would make her late, notified her office, and then patched her into a meeting already in progress so she could listen from the road.
This sounds cool and convenient, but Microsoft left one thing unsaid: the woman was logged into her home and car with her work ID, which means her employer could now have access to data from her house and car. If work-life balance matters to you at all, the idea that your home speaker might one day tell you to hurry to the office because you’re chronically late should be disconcerting.
These demos highlight the trade-offs inherent in a world where we use more and more connected devices. You have to give up some privacy to benefit from a network of devices tuned to you and your whims. But these trade-offs start to feel worse with Microsoft because, despite its range of consumer products like the Surface Pro and Windows 10 Home, Microsoft is in the business of serving businesses. Those are its main customers, and it’s who Microsoft spent most of today’s keynote talking to. You are not the business model; your employer is. Asking consumers to give their data to a big faceless corporation like Google so it can sell ads is one thing; asking them to give all that data to the people who sign their paychecks is another.
A reliable, effective business computer can be your most essential productivity tool. There are many options, some cheaper than others, but without properly researching and selecting a quality machine, applications will run slowly, your computer may crash, your project- or business-management system will fail, and your workers’ efficiency and output can suffer. As you look for the right business computer, here are a few features and specifications to consider.
The processor is critical to your computer’s performance; it is what allows you to run and use several applications at the same time. Not all processors are made equal. First, look at the number of cores. In a single-core chip, one core handles every task; with multiple cores, each core takes on different computing jobs rather than the whole workload, making for a faster, smoother experience. A processor’s cache is memory available for short-term storage: the more cache, the more data the processor can retrieve quickly. Also consider the processor’s frequency, its speed measured in hertz, which is a good indicator of how quickly and how well it can work.
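As a rough illustration of the multi-core point above, this minimal Python sketch (the prime-counting function is just a hypothetical stand-in for any CPU-bound job) hands independent tasks to a pool with one worker per core, so each job can run on its own core:

```python
import os
from multiprocessing import Pool

def count_primes(limit):
    """CPU-bound stand-in job: count primes below `limit` by trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    jobs = [20_000, 30_000, 40_000, 50_000]
    # One worker per core: independent jobs run in parallel instead of queuing
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(count_primes, jobs)
    print(results)
```

On a quad-core machine the four jobs run side by side; on a single core they simply queue up one after another, which is exactly the difference the paragraph describes.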
Disk Drive Storage
Your hard drive is where all your files are kept. The amount of storage varies by model, but ideally look for a computer with no less than 500GB of space. Many new business computers offer at least 1TB of hard-drive storage, letting you save numerous documents, images and video files. You might also want to consider a solid-state drive instead of a traditional hard disk: with no moving parts, they tend to last longer.
Random Access Memory, or RAM, is where your computer stores files for fast access. The more applications you use at once, the more RAM you use, and when most of your RAM is in use, your computer will slow down significantly. As with hard-drive storage, the amount of RAM varies by model, but look for business computers with at least 2GB of RAM to ensure decent performance throughout your work day.
If you plan to hold regular video conferences, or you work in a field handling graphics and other multimedia, you need a display with a high resolution: the higher the resolution, the sharper and more detailed the image. If you seldom work with images or video, and the only visuals you use are a reporting dashboard and team-management modules, resolution need not be a top priority.
Ports and Connectivity
The more ports and connectivity options your machine includes, the more of your other devices you can use with it. Popular options include USB 3.0, USB 2.0 and HDMI ports, along with an optical drive. Optical drives are becoming less common, though, so if you don’t need DVDs it may be worth skipping that option. USB Type-C ports are new and a good way to future-proof your purchase, but make sure one isn’t your only port. Also consider computers with Bluetooth for easier connection to other Bluetooth-enabled devices such as phones, tablets and even headphones.
The warranty on your computer matters: it covers any technical problems that may arise with your machine. Most manufacturer warranties cover the computer’s parts and any maintenance that has to be carried out. Most business computers are covered for only one year, but many of the best models are covered for three or more.
In general, your goal is to find a machine that can handle all the work you do in your industry. Every model differs, and it’s up to you to find one that is dependable, effective and capable of everything you ask of it.
In 2016, online education continued to grow in popularity. More students are now enrolled in online courses than ever, and world-class universities have even begun trialling virtual and augmented reality as learning tools. Digital companies released nano-degrees, while employers including Google and Goldman Sachs started exploring and adopting online digital credentials. Meanwhile, enrolment in on-campus degrees fell at the majority of United States business schools.
As the calendar rolls over to 2017, here’s our list of the top 3 educational technology innovations to keep an eye out for this coming year:
Artificial Intelligence (AI) is becoming increasingly popular as a tool for online education, despite being one of the most overhyped innovations of 2016. By combining computers’ processing power with the cognitive abilities of the human brain, AI is already used heavily in consumer tech markets, most visibly powering assistants like Apple’s Siri and Amazon’s Alexa. Jozef Misik, managing director of Knowble, a language-tech start-up whose products are built on AI, predicts the technology will soon extend beyond personal assistants into the broader education industry. Improvements in the accessibility of AI programs, combined with the falling cost of the technology, will see more educators integrating these applications into their courses.
The benefit for instructors will be tools capable of carrying out routine teaching tasks. Algorithms can help instructors assess learning effectiveness and support the delivery of content, and deep-learning systems can read, write and mimic human behaviour. Current examples range from the “intelligent tutoring” system pioneered by Colorado State University to improve the marking reliability of assessments, to “Jill Watson”, the virtual teaching assistant used by Georgia Tech in 2015.
There will be opposition to AI, as there was when MOOCs were introduced. Some professors will resist learning the application development and algorithm design needed to run these new platforms, and concerns that staff will be displaced have already been raised. And as other eagerly adopted innovations have shown, students will not take up new technology unless there is a real benefit.
“AI in education is not inevitable — but it’s necessary,” says Satya Nitta, director of education and cognitive sciences at IBM.
Demand for the conventional university degree has fallen, and there is now a real focus on alternative “micro” credentials. These courses sit outside degree programmes and deliver specific knowledge and skills in fields such as data science, where talent is scarce. They cost far less than an MBA, and by taking multiple courses a personalised academic experience can be created.
With more companies recognising online education credentials, and professionals switching employers and careers more regularly, expect the trend to grow rapidly. People will have to continuously re-skill, says Anant Agarwal, CEO of the online education platform edX. “This is the age of continuous learning,” he says. “Even when you have a master’s degree, education does not stop there.”
In 2012, when MOOCs first took off, they were considered a threat to on-campus learning. In recent years, however, universities have started mixing “bricks and clicks”. Many prestigious business schools now run well-regarded “blended” versions of their MBA degrees that combine remote online lessons with in-person assessments. This approach has allowed institutions to maintain the intimate environments in which students can network, while still offering flexibility.
In fact, most universities are integrating technology into physical classrooms to improve educational outcomes. Technology will continue to alter how educators communicate with students and deliver knowledge. Course instructors are pushing students to be familiar with content before lessons, so that lessons instead become places for debate about key ideas and conflicting theories.
Using video cameras, GPS and microchips, the NTU system was tested successfully at a Yishun site.
With the aid of microchips, cameras and satellite tracking, enormous tower cranes may someday be controlled remotely. Using digital software that works out the best lifting path, together with modern cameras, a crane operator can now complete remotely any task that would normally be done from the crane cabin. With Global Positioning System (GPS) tracking, they can also evaluate how closely they have followed the path calculated by the software.
Researchers at Nanyang Technological University (NTU) created the “smart crane system”, which was trialled recently at an executive condominium site in Yishun run by Kimly Construction. One of the core features noted during the trial was microchip-enabled tracking of precast parts.
Precast construction is a method used to boost productivity, but it is often still labour-intensive, because workers must enter every concrete part that arrives on site into a database. NTU’s system instead uses Radio Frequency Identification (RFID) tags: each slab is scanned and its information entered into the database automatically. The result is a real-time project model, with the Building Information Modelling (BIM) system updated automatically as individual slabs are moved into place.
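As a loose sketch of that data flow (all class and field names here are invented for illustration, not NTU’s actual schema), scanning a tag registers a part with no manual entry, and placing it updates the live model’s status:

```python
class SiteInventory:
    """Hypothetical stand-in for the site database feeding the BIM model."""

    def __init__(self):
        self.records = {}

    def on_scan(self, tag_id, part_type):
        # An RFID read at the site gate registers the part automatically
        self.records[tag_id] = {"type": part_type, "status": "delivered"}

    def on_placed(self, tag_id):
        # The crane installs the part; the live model reflects it immediately
        self.records[tag_id]["status"] = "installed"

inventory = SiteInventory()
inventory.on_scan("RFID-0001", "wall panel")
inventory.on_placed("RFID-0001")
print(inventory.records["RFID-0001"]["status"])  # installed
```

The point is simply that the record exists, and changes state, without anyone typing it in: exactly the saving over manual data entry the article describes.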
“From the beginning, it’s electronic information that can be passed from one process to another,” explained NTU research fellow Meghdad Attarzadeh. Gains have already been seen, with efficiency up 10-20% for site logistics and the time spent checking inventory down 30%.
The BIM system is also used to compute the most efficient and least dangerous lifting path for every single item. A GPS sensor attached to the crane’s hook monitors how closely the operator is matching the optimal path.
A tower crane operator’s vision is often obstructed, so they typically depend on signalmen below to guide them. The new crane-hook cameras let the operator see exactly where the hook is positioned relative to the crane’s surroundings.
“For the very first few elements, it was a challenge (for the crane operators),” said Mr Choo, director of Kimly Construction. “But by the last few, when they had built up the confidence, the speed really picked up.”
NTU Associate Professor Robert Tiong, leader of the research group, said: “The dream is for a game-changing ‘smart crane’ system.” He also noted the system might permit cranes and low loaders to be operated remotely in the future.
The partnership started in December 2015, with the NTU team spending months developing and testing the system. For more than six weeks across October and November, the new system was used to build the 10th and 11th floors of two blocks of the project. The trial was solely funded by the Building and Construction Authority’s Productivity Innovation Project scheme.
Researchers at NTU are looking to develop the system further by partnering with organisations such as JTC Corporation, a national industrial developer, while Kimly Construction hopes to use the system on suitable future projects. As Mr Choo states, “It helps the (precast) process become a more systematic method of construction.”
Is mobile application testing similar to standard software testing?
If the answer were “yes”, the lives of software testers would be easier, but unfortunately it’s a resounding no. Mobile application testing differs in several ways: the range of devices is far greater, so standard approaches don’t cut it, because you are testing for so many more variables.
One of the key distinctions is the way Appium, the most popular testing tool for mobile apps, differs from Selenium, one of the most common tools for testing non-mobile applications. Appium and Selenium operate differently, and that significantly affects the degree of automation possible in your tests.
By comparing Selenium and Appium against one another, I’ve gained some insight into how to get the most out of each tool. Here’s what I found.
Selenium, the popular browser-based suite of automation tools, is a vital testing application because it automates web-browser testing. Automating tests for web applications is not its only useful feature, though: it can also automate online administrative jobs. Because of this, a lot of DevOps engineers feel you’re essentially getting double the value with Selenium.
It’s important to remember that Selenium isn’t one single tool, but rather several smaller tools which, combined, make up two central functional parts.
The largest issue with Selenium, which has become obvious as smartphones are used for everything, is that it was not developed for mobile-app automation testing. Fortunately, Appium addresses this core issue. Based on Selenium (just like Selendroid), Appium is a tool developed specifically for automated mobile testing.
Appium Architecture: Written in Node.js and exposing a REST API, Appium is an HTTP web server. Apps do not need to be recompiled, because Appium drives the vendor-provided automation frameworks, such as Google’s UiAutomator or Apple’s UIAutomation, so the app you test is the app you deliver. There is no language lock-in (or framework lock-in), because the vendor frameworks are wrapped inside the WebDriver API. That API specifies a client-server protocol, which lets the client write the appropriate HTTP requests to the server in any language.
Only a couple of alterations were needed for the WebDriver protocol to handle mobile, since WebDriver is already the standard for browser automation. Another benefit of Appium is that it’s open source: information on the Appium architecture is freely available on GitHub if you want to know more.
Appium Sessions: In Appium, a session is simply the automation of an individual mobile test. Every client sends an HTTP POST request to the server with a “desired capabilities” JSON object. The server then launches the session and responds with a session ID, which is used in later requests within that session. Desired capabilities are a set of keys and values sent to the Appium server to indicate what kind of automation session the user wants; other capabilities can also modify server behaviour during automation (the Appium documentation lists them in full). For these reasons, Appium is viewed as one of the best ways to automate mobile-web application testing.
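As a rough, stdlib-only sketch of what that first request carries (the capability values and app path are made up for illustration, and nothing is actually sent), a client builds a JSON body like this before POSTing it to the Appium server’s session endpoint:

```python
import json

# Desired capabilities: key/value pairs describing the automation
# session the client wants the Appium server to start
desired_caps = {
    "platformName": "Android",
    "deviceName": "Android Emulator",   # hypothetical device
    "app": "/path/to/app.apk",          # hypothetical app under test
}

# The WebDriver protocol wraps the capabilities in a JSON request body;
# the server would respond with a session ID for use in later requests
payload = json.dumps({"desiredCapabilities": desired_caps})
print(payload)
```

Because the protocol is just JSON over HTTP, this same body could be produced and sent from any language, which is the language-agnostic point made above.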
Mobile application testing is unique, and it differs greatly from standard web-application testing: the multitude of devices and operating-system versions demands a versatile tester. However, browser-automation tools like Selenium can have their frameworks modified for mobile test automation, so Selenium can be used like Appium. And whichever tool you choose, both can be used through Sauce Labs’ cloud-based testing platform.
The DEF CON 24 event in 2016 hosted the DARPA Cyber Grand Challenge, in which seven teams’ automated security systems faced off in a capture-the-flag-style competition for a top prize of two million dollars.
The objective was to design programs that autonomously detected vulnerabilities and self-patched to fight off system intrusions.
This technology is interesting in that it works with the intricacies of today’s network environments to improve security in several different ways. A key question, however, is whether it is worthwhile, given the various unintended negative impacts and dangers connected to autonomous patching. And while this is still a niche area, industry leaders should start considering how advances in information-security programs and connected systems could go awry in the future.
It is true that many businesses could use some type of automated patching technology. It would free up resources, since software flaws would be identified and patches released automatically, rendering future attacks on the system ineffective. The idea is very appealing to both auditors and executives. In reality, though, patching is a complicated process encompassing several systems, people and processes that operate in tandem to ensure updates are released effectively, improving security without sacrificing the stability of the network environment.
Stability is the top priority for many security professionals, many of whom have first-hand experience of one bad patch taking down an otherwise completely stable system. For obvious reasons, bad patches are typically seen as a worse outcome for the business than the security weakness the patch was trying to fix. That said, the software quality of contemporary operating systems and applications is better than ever, and it is entirely possible to release patches in a semi-automated style. As a bare minimum, semi-automation should happen at the workstation level for third-party software such as Java and Adobe applications.
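As a minimal sketch of what “semi-automated” can mean in practice (the package names and version numbers below are invented for illustration), the detection side is automated while the rollout still waits for human sign-off:

```python
installed = {"java": "8.0.301", "reader": "21.005"}   # hypothetical workstation inventory
latest    = {"java": "8.0.321", "reader": "21.005"}   # hypothetical vendor feed

def version_tuple(version):
    """Turn '8.0.301' into (8, 0, 301) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def pending_patches(installed, latest):
    """Automatically flag packages lagging the vendor's latest release."""
    return [name for name, ver in installed.items()
            if version_tuple(ver) < version_tuple(latest.get(name, ver))]

queue = pending_patches(installed, latest)
print(queue)  # ['java'] -- queued for human approval, not auto-installed
```

Keeping a human between the detection step and the rollout step is what protects the stability that the bad-patch horror stories are about.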
But how can autonomous software and firmware updates be applied to servers, applications, databases and network infrastructure? Moreover, the cost of automated patching relative to its benefits has many experts questioning whether it is worthwhile. Perhaps the advantages of automated patching do outweigh the drawbacks, but it is worth pointing out that the DARPA Cyber Grand Challenge used a specific testbed of computer systems running custom software that had not previously been evaluated.
For the first ever automated patching competition in information security, it is impressive that the competing teams’ systems took only 10 hours to find and patch vulnerabilities that would normally take weeks or months to resolve. How effective autonomous patching will be against complex legacy applications and enterprise software products, however, remains unclear. Each circumstance is unique, but the reality is that, given the minimal resources of business IT security teams and people’s proneness to mistakes, security should be automated wherever possible.
We should start thinking now about which parts of the patching process could, or should, be automated in the future. By attempting to automate certain security functions today, we will likely be able to carry many of those systems and lessons into other areas, which would greatly aid future security efforts.
When someone asks which laptop will meet their needs, the key thing I tell them is that there is no decisive winner. There are so many device types and price ranges to consider that it really does depend on who’s asking. What I can provide is a guide to the aspects you should evaluate before splashing out on a brand-new computer.
Intel’s Core CPUs are probably the best option when buying a laptop: for multitasking and multimedia work, the Core i3, i5 and i7 all perform better than their competitors.
Laptops with Core i3 processors are normally simpler systems aimed at entry-level users who only need the basics, while the large majority of laptops sold through retail run a Core i5.
If you want to push your laptop to maximum performance, you can’t go past the Core i7. While it is the fastest CPU in the range, however, the heat radiating from the bottom of the laptop can make long stretches of actually using it on your lap uncomfortable.
The minimum amount of RAM required to keep your system performing at its best is 4GB. Essentially, the more RAM you have, the more applications you can run simultaneously, and the more data the system can access at any point in time, which is highly useful if you plan to do a lot of image editing.
Bluetooth and Wireless Networking:
People commonly forget to check this when buying a new laptop, which is unfortunate, as most people depend on a reliable internet connection to use their laptop productively. As a bare minimum, look for laptops with dual-band Wi-Fi adapters. Most routers broadcast both 2.4GHz and 5GHz networks; a dual-band adapter lets you work faster on the quicker 5GHz band and keeps your laptop separate from the other devices crowding the 2.4GHz network. Good wireless connectivity is also essential for work tasks like cloud backups and many CMS and CRM systems.
Bluetooth is often underappreciated, as many buyers only use it to connect wireless keyboards and mice. Look for a laptop with Bluetooth 4.0, which will let you set up things like a cord-free hi-fi system, so you can play music from your laptop, whether from Spotify or online radio, through your stereo.
Many people think of home computers as ready-to-use desktops good for homework and surfing the web, but they can do a lot more: stream media, download HD films and shows, edit video and audio, even play games. Regrettably, many computers offer only limited performance for these jobs. That’s what makes the Acer Predator such an outstanding desktop PC.
The Predator delivers much better performance than most computers and can easily handle demanding professional workloads. This inexpensive gaming machine has a top-end processor and a dedicated graphics card. It has fewer USB ports than some computers, and Acer doesn’t offer as many support options as other manufacturers, but it comes with outstanding parts and excellent speed.
The base-model Predator is still a better gaming machine than most computers. It uses a top-tier Intel Core i5-4590 processor, ranked among the best CPUs by Tom’s Hardware, which also earns excellent scores in Passmark’s benchmark tests. On its own, the CPU is strong enough to run any task you throw at it, but the Predator adds a solid mid-range graphics card to handle graphical processing, letting you play games on high settings and run other graphics-intensive programs without putting pressure on the CPU.
Like the majority of the computers we reviewed, the Predator is also ready for upgrades. If you begin to notice significant slowing, or the machine can’t handle the games you want, it’s easy to open the case and install new parts yourself. Even if you pay a professional to install new components, you’re still saving over buying a new desktop.
Memory & Storage
At this point, the standard setup for a computer is a SATA hard drive and 8GB of RAM, and the Acer Predator doesn’t stray from this formula with its high-speed 1TB hard drive. You can install a secondary SATA drive or SSD in one of the available storage bays. In addition to the installed 8GB of RAM, you can fit up to 32GB across the four memory slots. The extra memory is great if you process or compress media, but generally you don’t need more than 8GB to run a 64-bit operating system. Lastly, the computer includes a DVD drive.
The Acer Predator has an interesting look: mostly black with orange highlights around the edges. On top of its appealing styling, it is very simple to open and customize. There are plenty of bays available for new hard drives, RAM and expansion cards. It features an SD card slot, an HDMI port, a VGA port and just five USB ports: three USB 3.0 and two USB 2.0. That’s frustrating for a gaming PC, since so many peripherals require USB ports.
Acer offers a one-year warranty with all new desktop PCs, and if you want a longer support period you can buy a two-year warranty. That is not as long as many other companies offer for their extended warranties, though. A number of customer support options are available to customers. Telephone support lasts as long as your warranty, and you can also reach support representatives by live chat, though we could not find an email address for support questions. Acer’s support pages also offer user forums and driver downloads.
The Acer Predator is a leading desktop PC with far more emphasis on performance than any other computer in our lineup. With its great parts, you can use it for gaming and other resource-intensive jobs. It has only five USB ports, so you may run out when plugging in your peripherals. Still, the Predator is among the best home PCs we evaluated.
NAS boxes are nigh-on perfect devices for storing and streaming your multimedia collections, including music, video and photos, throughout your home. Generally, though, that streaming hasn’t included UHD/4K/2160p video, which requires a fair bit of CPU power. Until Synology’s DS216+ and its new DSM 6.0 OS, that is.
Before you get too stoked: Synology’s NAS boxes can only transcode 2160p down to 1080p (or a lower resolution, depending on the device receiving the stream). The same goes for Synology’s main competitor, QNAP. That means you can stream UHD/4K, but not at its true resolution. Note that I’m talking about using Synology’s integrated Video Station player or its DLNA server; if your TV or device supports network browsing and has the computational horsepower, you can open any file and play it at full resolution.
Synology’s DS216+ is one of the faster consumer NAS boxes at $300 (drives not included); the company sent it to us to test the transcoding. It’s equipped with an Intel Celeron N3050, 1GB of memory and two drive bays, so you can add up to 16TB of storage. In our copy tests, writes and reads of a single 20GB file proceeded at about 109MBps, and a more taxing 20GB mix of smaller files and folders at about 63MBps. That’s fast for consumer-grade NAS, and it makes the DS216+ a great repository for backups (more on that subject later).
The DS216+ has two USB 2.0 ports on the back and a single USB 3.0 port on the front for copying files to the box. There's a dedicated copy button on the front panel: pressing it moves all the files from a USB drive to the box. There's also a single gigabit ethernet port and, somewhat unusually for a small office/home box of recent vintage, an eSATA port. A full-on USB 3.x (5Gbps/10Gbps) port would be better suited to the intended market.
To test transcoding and streaming, I loaded the DS216+ with numerous test files, including about a dozen 2160p (UHD 3840×2160 and 4K 4096×2160) videos. Files were streamed to Windows Media Player augmented with the LAV DirectShow filters, using the in-browser Video Station player. Media Player Classic Home Cinema, also set to use the latest version of the LAV filters, served as a test control.
Everything up to and including 1080p played or streamed fine. 2160p videos (AVC and HEVC) played well too, at least those limited to about 30 frames per second and around 6 megabits per second (HEVC) or 20Mbps (AVC). AVC is significantly easier to process than the more heavily compressed HEVC. Beyond those limits, both audio and video stutter started to creep in, and 60fps files weren't recognized at all. My only other complaint is that the downscaling in Video Station could have used more anti-aliasing in areas with lots of fine detail.
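Those observations boil down to a rough rule of thumb. The sketch below encodes the limits I observed (it's an empirical approximation from my test files, not an official Synology spec):

```python
def ds216_can_transcode(codec: str, bitrate_mbps: float, fps: float) -> bool:
    """Rule of thumb from the streaming tests above: the DS216+ transcoded
    2160p smoothly up to ~30fps and ~6Mbps for HEVC or ~20Mbps for AVC;
    beyond that, stutter crept in and 60fps files weren't recognized."""
    if fps > 30:
        return False  # 60fps 2160p files failed outright in testing
    limit = 6 if codec.upper() == "HEVC" else 20
    return bitrate_mbps <= limit

print(ds216_can_transcode("HEVC", 5, 24))   # typical UHD stream: plays fine
print(ds216_can_transcode("HEVC", 10, 30))  # exceeds the ~6Mbps HEVC ceiling
print(ds216_can_transcode("AVC", 15, 60))   # 60fps: not recognized
```

The asymmetry between the two codecs is the point: HEVC's heavier compression means the Celeron N3050 hits its ceiling at roughly a third of the bitrate it can handle for AVC.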
As I mentioned up front, you can bypass the streaming and transcoding and play files directly if your device supports playing from network locations. Doing that with MPC-HC, everything played just fine, including 20Mbps/60fps 2160p.
Synology's audio support is excellent. The list of supported types includes FLAC, MP3, WMA, M4A, Ogg, APE, and both Apple and Windows lossless, as well as all wave files from 44.1kHz/16-bit to 96kHz/32-bit, including 5.1- and 7.1-channel surround types. That's everything in my collection of test files apart from Opus and an ancient VQF file that's long out of date. You can play any of the supported types using the included Audio Station app or stream them via DLNA.
Supported image formats include JPEG, BMP, GIF, PNG, and TIFF.
DSM, or DiskStation Manager, is the operating system for Synology's NAS boxes. Like rival QNAP's QTS, it's a full windowing system that works within your web browser. Below is a screenshot, which does it more justice than any words I could write. It works much like Windows, Linux (which it actually is), or OS X, with clicking, dragging, lasso-ing, and so on.
DSM 6.x brings the operating system into the 64-bit world, which is of limited value to most home users but will allow more onboard memory in Synology's high-end boxes. DSM 6.0 also supports Btrfs (B-tree file system), whose copy-on-write (COW) technology allows for easy cross-device storage pools and data snapshots.
Beyond streaming multimedia, a few of the other things you can do with a Synology NAS box are record and search the output of at least one IP camera, create your own email server, and, its newest feature, collaborate with other users on spreadsheets. Word processing and presentations will presumably follow. Then there's the ability to watch and record TV (with a USB tuner attached), Wi-Fi connectivity (with an 802.11x dongle attached), and more.
There's also a new MailPlus app and server that supports up to five users for free. It's quite a bit slicker than the usual MailServer application and has dedicated apps for Android and iOS devices. Speaking of which, Synology provides mobile apps for viewing images, videos, and other files stored on the box. There's a lot more, but I'll let you explore Synology's website for the details.
Save your stuff
Though this article is focused on multimedia serving, the DS216+ and DSM 6.0 also offer excellent backup services. In addition to Time Machine support for Macs, Synology offers its own Cloud Station with clients for Windows, OS X, Linux, Android, and iOS, which means the Synology box can be used to back up all your PCs and mobile devices. My only issue was that during a rather large 400GB initial backup, the client used excessive bandwidth. There's no throttle setting, so I had to pause the sync process whenever I needed to use another network application.
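To see why the missing throttle matters, here's a rough estimate of how long that 400GB initial backup occupies the network at full gigabit speed versus a hypothetical throttled rate (the 200Mbit figure and 90% link-efficiency factor are my own illustrative assumptions, not Cloud Station settings):

```python
def backup_hours(size_gb: float, bandwidth_mbit: float, efficiency: float = 0.9) -> float:
    """Hours to move size_gb gigabytes over a link running at bandwidth_mbit
    megabits per second, assuming `efficiency` fraction of the raw rate is usable."""
    seconds = (size_gb * 8000) / (bandwidth_mbit * efficiency)
    return seconds / 3600

# The 400GB initial backup from the review, saturating gigabit vs. a 200Mbit cap:
print(f"full gigabit: ~{backup_hours(400, 1000):.1f}h")
print(f"throttled to 200Mbit: ~{backup_hours(400, 200):.1f}h")
```

Saturating the link gets the job done in about an hour, but leaves nothing for other applications; a throttle stretches the backup to a few hours while keeping the network usable, which is exactly the trade-off a setting would let you make.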
New to the mix are an enhanced version of the Backup & Restore app called Hyper Backup Vault, and Snapshot Replication, which leverages the new Btrfs file system's snapshot capabilities. You can also sync multiple NAS boxes (even other vendors', if they support rsync) across the world. Online services such as S3, Glacier, and Dropbox can also be used as backup or sync targets.
The Synology DS216+ is a fantastic little box and streams multimedia as well as any NAS box in its class. If you're looking for a central repository for your tunes, photos, and movies that any device in your house can access, you could do far worse.
But if you're not dead set on 2160p transcoding, you can get away far cheaper with other NAS boxes, such as Synology's less-expensive models, of which there are plenty. When simply opening and playing files from a network drive, I was able to play 4K/UHD files just fine from a far older Synology DS411slim.
DSM 6.0 brings Synology's boxes up to date in terms of backup, replication, snapshots, and spreading storage across multiple devices. Those are features of more interest to business users than multimedia mavens, but they're welcome nevertheless.
One caveat on the whole NAS-for-multimedia deal: streaming and accessing files on a NAS box is easy enough for anyone, but setup requires a fair amount of tech savvy.