
Cloud Computing

 

Week 1 Blog

Hello class.  I am Jeff Fackler.  I am originally from Brandenburg, KY, but I currently live in Omaha, NE.  I am married and have two children.  I have been in the Air Force for 19 years as a network admin and manager/section chief of several Communications Squadron Network Control Centers.  I hold the Security+ and Network+ certifications.  I am currently in this class to finish up my BS in Information Technology.  I recently transferred from Arizona State University after finding out that ASU would not accept 87 of my previous credits.  It is a shame that they do not tell you that until after you are enrolled and taking classes there.  Bellevue has been great about setting me up for success.  I have one year left to finish my BS, and then I will be attending Creighton Law School to pursue cyber law.  That being said, I have little knowledge of cloud computing, so I hope to learn a lot in this class.

While researching cloud computing for my class blog, something piqued my interest: since I am in IT, where we deal with a multitude of certifications, I wondered whether there was a certification specifically for cloud computing.  Lo and behold, there most certainly is.  CompTIA offers both a Cloud+ and a Cloud Essentials course and exam.

Now, I do not think I will be taking the course or the exam anytime soon, as my career is taking me elsewhere, but I thought some of you might be interested in what CompTIA has to say about this certification.

“Cloud Essentials is relevant: The exam covers situations and equipment with respect to your specific experience and expertise. You will not need to perform tasks or master technical material you will not use in your day-to-day job.

Cloud Essentials is practical: Whether you are new to cloud computing or rely on it for high-level business practices, this fundamental approach provides the exact amount of preparedness you need.

Cloud Essentials remembers every scenario: The certification exam includes a risk-and-consequences component, understanding that every business has its unique IT needs. Each one will apply cloud technologies differently. Prepare for contingencies, malfunctions, security threats and other situations that require swift, effective decisions.

Cloud Essentials values security: One of the most common worries about cloud computing is the safety of the data involved. CompTIA Cloud Essentials answers those worries and shows you how to keep your sensitive data as secure in the cloud as it would be anywhere else.” (CompTIA Cloud Essentials, 2017)

As you can see, cloud computing is the here and now.  If you are going to stay in the IT career field, it is imperative that you gain some cloud computing knowledge.

Now tell me: do you plan on taking the CompTIA Cloud+ exam?

 

References:

CompTIA Cloud Essentials. (2017). Retrieved December 02, 2017, from https://certification.comptia.org/certifications/cloud-essentials#overview

 

Week 2 Blog – reference https://wordpress.com/post/jfackler1979.wordpress.com/9 

(I posted it there first thinking we had to create a different blog for each week.  It was posted during week 2’s timeline.)

Here we are, week 2 of BSIT 375.

While researching comparisons of NAS and cloud computing, I came across some information that I would like to share.  There are some features that are must-haves when building out a NAS as your company's storage.

  1. Security – Your NAS will be storing confidential company data; consider the use of SSL/TLS to protect the web management interface a bare minimum.
  2. Power Consumption – Because your NAS is likely to be switched on 24/7, energy consumption matters both for environmental impact and for your electricity bill.
  3. iSCSI Support – An increasing number of NAS units come with iSCSI support, making this a de facto standard feature.
  4. Ability to sync/back up to another NAS – Storing a copy of the data on another NAS at a different physical location is invaluable in ensuring data survivability.
  5. Multi-functional Capabilities – Extra features built into a NAS can be very useful.  FTP services can be used for staging and transferring large files across the Internet – with the right firewall configuration – and the ability to host web files can be used for internal websites or intranet portals.
  6. Deduplication – Depending on the deduplication ratio gained and your preferred backup regime, this can be invaluable to a small or medium business.
  7. On-board, hardware-accelerated encryption – If possible, opt for a NAS with on-board encryption support, which can perform encryption at much faster speeds.  Encryption is the best defense against a vendor or business partner simply walking away with one of the company's hard drives.
  8. Support for syncing with cloud storage – This lets your business avoid the firewall or VPN configuration required for NAS-to-NAS backups.  Some cloud-based services also have high levels of redundancy, which reinforces data survivability with a second layer of backup.
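To make the deduplication item concrete, here is a minimal sketch of block-level deduplication (a toy illustration of the idea, not any vendor's implementation): identical chunks are stored only once, and an ordered list of hashes records how to rebuild the original data.

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and keep only the unique ones.

    Returns (store, recipe): store maps a SHA-256 digest to the chunk
    bytes, and recipe is the ordered list of digests needed to rebuild
    the original data.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store each unique chunk once
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe) -> bytes:
    """Reassemble the original data from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)

# Highly repetitive data dedupes well: 100 identical 4 KiB blocks
data = b"A" * 4096 * 100
store, recipe = deduplicate(data)
print(len(store))                      # 1 unique chunk stored
print(rebuild(store, recipe) == data)  # True
```

Real NAS deduplication works at the filesystem or block layer, but the payoff is the same: backups with lots of repeated data take far less space.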

All of these are items to look into if your company is going to use a NAS as its primary means of data storage and access.  Do you feel like there were any items that I missed?  Please feel free to let me know on the discussion board.  Thank you.

 

 

Week 3 Blog

This week I am going to talk a little bit about what I have found out about SAS-4.  I thought it was interesting that I had not delved into learning about SAS prior to this week's class; I have been maintaining servers for years and never looked into it.  SAS-4 is supposed to allow 22.5 gigabit-per-second transfers per lane.  To tell you the truth, I can only imagine that things like satellite imagery would need this type of transfer speed.  Looking into it further, I found that part of the reason is that the SAS protocol allows hundreds of users to access and write to the same drives at the same time.  Technology is changing at a very quick pace.  The release of SAS-4 shows that the PATA drives of yesterday, with transfer rates of just a few megabytes per second, are now the dinosaurs of data storage.  I cannot wait to see what 2025 holds for us in hard drive speeds and sizes.  What do you think of how fast this interface is?
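For perspective, the 22.5 Gbit/s figure is the raw line rate per lane.  SAS-4 uses 128b/150b encoding, so the usable payload rate works out to roughly 2.4 gigabytes per second per lane, which a quick back-of-the-envelope calculation confirms:

```python
# SAS-4 back-of-the-envelope throughput: line rate minus encoding overhead
line_rate_gbps = 22.5             # raw line rate per lane, gigabits/s
encoding_efficiency = 128 / 150   # SAS-4 uses 128b/150b encoding

payload_gbps = line_rate_gbps * encoding_efficiency   # 19.2 Gbit/s usable
payload_gbytes = payload_gbps / 8                     # convert bits to bytes

print(round(payload_gbytes, 2))   # 2.4 (GB/s per lane)
print(round(payload_gbytes * 4, 1))  # 9.6 (GB/s for a 4-lane wide port)
```

Compare that to a PATA drive moving a few megabytes per second and the "dinosaur" label seems fair.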

 

Week 4 Blog

Well, now that the holidays are over, it is time to get back to the grind.  With that, let's talk about hard drive heads.  Have you ever heard that dreadful grinding noise from your PC case while your PC seems to be doing nothing?  Well, I have.  A few years back I lost absolutely everything on one of my IDE hard drives and ended up having to rebuild from scratch.  I tried to retrieve information from the drive with a few recovery tools I had gotten while working in the Air Force, but I could get nothing; the data was too badly damaged.  After the rebuild I took that hard drive apart.  And what to my wondering eyes should appear but a not-so-shiny platter with 12 thick trenches dug into it.  It literally looked like a bad rotor on a car after the brakes have ground into it.  Those trenches in my platters were caused by the head of the hard drive coming into contact with the platters.  I believe the fault was in the actuator arm, but I am not an electronics engineer, so I cannot give a definitive answer on that.  This taught me a valuable lesson about keeping my PC backed up to a separate hard drive at all times, which I have done ever since.

 

Week 5 Blog

I have worked with virtualization a few times in my career.  The first was with a Citrix and NetApp combination while I was stationed in Doha, Qatar.  The next time I used virtualization was when I was stationed at Ramstein AB, Germany.  I virtualized a series of applications onto an application server and pushed the shortcuts out to all the client users' desktops.  This freed up space on the actual clients while still allowing them to use the applications, and their data was still stored locally and accessible each time they opened a virtualized application.  The next time was when I created an ISO of the machines on our ballistics network here at Offutt AFB, NE.  I used Hyper-V, built into Windows Server 2008, to create a virtual workstation from the ISO of another system that was currently on the network.  I did this for quicker recovery in case one of the physical systems crashed on us.  I like the idea of virtualization.  At home I have also created a virtual PC hosted on my personal computer.  With that one, I create a snapshot in Hyper-V prior to installing any software or updates, so I can test whether the software is infected with a virus or whether the update will crash my system in some way.  If it does, I can simply restore from the snapshot created prior to the change.
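The snapshot-before-change habit I describe above boils down to a simple pattern: capture a copy of the state before a risky change, then roll back if the change misbehaves.  Here is a toy model of that workflow in Python (an illustration of the pattern, not Hyper-V itself; the class name is mine):

```python
import copy

class SnapshotBox:
    """Toy model of snapshot-then-test: checkpoint state before a
    risky change, and roll back if the change misbehaves."""

    def __init__(self, state):
        self.state = state
        self._snapshots = []

    def snapshot(self):
        # Deep copy so later mutations cannot touch the checkpoint
        self._snapshots.append(copy.deepcopy(self.state))

    def revert(self):
        # Restore the most recent checkpoint, if one exists
        if self._snapshots:
            self.state = self._snapshots.pop()

vm = SnapshotBox({"installed": ["os"]})
vm.snapshot()                                  # checkpoint before the install
vm.state["installed"].append("suspicious-app") # the risky change
vm.revert()                                    # it misbehaved; roll back
print(vm.state)                                # {'installed': ['os']}
```

Hyper-V checkpoints do the same thing at the virtual machine level, saving disk and memory state instead of a Python dictionary.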

 

Week 6 Blog 

This week our topic was storage management.  I have had to deal with storage management, and the lack thereof, on a constant basis over my career.  Let's take, for instance, my unit's SharePoint page.  We have a total of 1 TB of storage space allocated to us from our parent Communications Squadron, and we are in charge of making sure that we do not go over this amount.  When we reach 90% of our storage capacity, we receive an automatic email from the server letting us know that we are close and need to purge some of our old data.  The problem we have as administrators is that we do not know which data is useful to each section of our squadron.  So when we get these notifications, we must contact the leads of each section and let them know how much data they need to delete from the SharePoint site.  This method has worked best for us in getting our space cleaned up.  I wish our section leads would set an Outlook reminder to do this once a quarter without us having to send out notifications, but I suppose that is part of being the administrator of the system.
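That 90% warning email is really just a threshold check on the current usage.  A minimal sketch of the logic (the numbers mirror our 1 TB allocation; the function name and message wording are mine, not SharePoint's):

```python
def quota_alert(used_gb: float, quota_gb: float = 1024,
                threshold: float = 0.90):
    """Return a warning string once usage crosses the threshold,
    or None while usage is still below it."""
    fraction_used = used_gb / quota_gb
    if fraction_used >= threshold:
        return (f"Storage at {fraction_used:.0%} of "
                f"{quota_gb:.0f} GB quota - please purge old data.")
    return None

print(quota_alert(950))   # over 90% of 1024 GB: warning issued
print(quota_alert(500))   # None: still under the threshold
```

A real server would run this check on a schedule and send the string as an email instead of printing it.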

 

Week 7 Blog

Since this week was about backup and storage solutions in our text, I think I will talk about our storage at work with our 9mm tapes.  We have a system at work that was built in 1994.  It runs on the REACT VAX operating system.  We store backups for it, since it is a very important system relating to the nuclear enterprise.  Now, we do not have tapes as old as 1994, but we do have some as old as 1999.  We test these tapes about three times a year at Vandenberg AFB to make sure that we can recover the database in the event of a catastrophic failure.  These tapes hold all the instruction sets for how the Minuteman III calculates its trajectory when flying to its target.  As you can probably guess, those instructions and calculations are very important.  It is our job to make sure that we can recover using those old tapes if need be.  We also have copies of them on CDs and hard drives.  As I said, that information is very important, so we keep a three-tiered recovery mindset, just in case one or more of the options fail.

 

Week 8 Blog

This week I will talk a little bit about SSDs vs. SATA drives.  I had an issue this week while upgrading a system to a new Windows 10 image made for the Air Force.  I tried to install it to the system's SSD three different times, and it would not take the image.  Since this particular system had two different hard drives in it, I decided to install it to the older SATA drive that was the secondary HD.  It took the image on the very first try.  That racked my brain for a bit, so I did some research.  It turns out that when I first installed the SSDs over a year ago, I had done so by cloning the SATA drive to the SSD with Samsung's cloning software.  From what I could find, the cloning software locked down the drive so that it will not allow you to re-image over it without re-arming the drive with the Samsung cloning software.  So my project for next week will be to take each of those drives (5 total), re-arm them, and wipe them as new drives.  Until then, the old SATA drives with Windows 10 will have to do, while we re-utilize the SSDs for a future project.

Week 9 Blog

For this week's blog I want to talk about Hyper-V.  I just built a domain controller for a project in another class, and I built it using Hyper-V.  I have used Hyper-V on many different occasions.  I like Hyper-V's snapshot function; to me, it is a very quick way of creating a backup image, and you can keep many snapshots at once.  I have used this function on my home computer to make an image prior to doing big updates to my OS or software.  This has allowed me to revert in case the update throws something out of whack on my box.  I consider this a best practice.  If I get a virus, I just revert to the latest snapshot, and it is as if the virus had never existed.  I hope to work with Hyper-V and similar programs more in the future.