[gpfsug-discuss] Virtualized Spectrum Scale

Laurence Horrocks-Barlow laurence at qsplace.co.uk
Tue Oct 25 20:53:55 BST 2016


Kevin,

This is how I run test systems; I let virtio devices be attached to multiple KVM systems. It works well.
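
For example (a rough sketch only; the guest names gpfs-node1/gpfs-node2 and the device /dev/sdb are placeholders, not taken from this thread), the same raw block device can be attached to two KVM guests with virsh, marked shareable and with host caching disabled so both guests see consistent data:

    # Attach the same raw device to both guests.
    # --mode shareable allows concurrent attachment, --cache none bypasses
    # the host page cache, --persistent records the disk in the domain XML.
    virsh attach-disk gpfs-node1 /dev/sdb vdb --cache none --mode shareable --persistent
    virsh attach-disk gpfs-node2 /dev/sdb vdb --cache none --mode shareable --persistent

Inside each guest the disk then shows up as an ordinary block device that Spectrum Scale can use as an NSD.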

-- Lauz

On 25 October 2016 20:05:12 BST, Kevin D Johnson <kevindjo at us.ibm.com> wrote:
>Mark,
>
> 
>
>You can run Spectrum Scale on virtual machines.  As long as the
>virtual disks present to the file system as devices, you should be good
>to go (for example, "cat /proc/partitions" should show your virtual
>disks as devices).  I typically use raw SCSI devices with virtual
>machines and that seems to work well.  KVM also lets you share the
>disks between guests, which is important for emulating more
>production-level uses.  It is good for a lab or for trying something
>quickly without provisioning physical machines.  We have sometimes used
>Spectrum Scale with virtual machines in production, but we typically
>recommend bare metal if/when possible.
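>
>For instance (illustrative output only; device names, major numbers, and
>sizes will differ on your systems), a raw SCSI disk and a virtio disk
>would appear in /proc/partitions roughly like this:
>
>  major minor  #blocks  name
>    8       16  104857600 sdb
>  252        0  104857600 vda
>
>Those are the devices you would then point Spectrum Scale at when
>defining the NSDs.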
>
> 
>
>Kevin D. Johnson, MBA, MAFM
>Spectrum Computing, Senior Managing Consultant
>
>IBM Certified Deployment Professional - Spectrum Scale V4.1.1
>IBM Certified Deployment Professional - Cloud Object Storage V3.8
>
>IBM Certified Solution Advisor - Spectrum Computing V1
>
> 
>
>720.349.6199 - kevindjo at us.ibm.com
> 
>----- Original message -----
>From: "Mark.Bush at siriuscom.com" <Mark.Bush at siriuscom.com>
>Sent by: gpfsug-discuss-bounces at spectrumscale.org
>To: "gpfsug-discuss at spectrumscale.org"
><gpfsug-discuss at spectrumscale.org>
>Cc:
>Subject: [gpfsug-discuss] Virtualized Spectrum Scale
>Date: Tue, Oct 25, 2016 2:47 PM
> 
>
>Is anyone running Spectrum Scale on virtual machines (Intel)?  I’m
>curious how you manage disks.  Do you use RDMs?  Does this even make
>sense to do?  If you have a 2-3 node cluster, how do you share the
>disks across nodes?  Do you give each node's VM its own VMDKs (if not
>RDMs), or is there some way to share access to the same VMDKs?  What
>are the advantages of doing this other than making use of existing
>hardware?  It seems to me that for a lab environment or a very small,
>non-performance-focused implementation this may be a viable option.
>
> 
>
>Thanks
>
> 
>
>Mark
>
>_______________________________________________
>gpfsug-discuss mailing list
>gpfsug-discuss at spectrumscale.org
>http://gpfsug.org/mailman/listinfo/gpfsug-discuss

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.

