[gpfsug-discuss] GUI and pmsensors/pmcollector

Markus Rohwedder rohwedder at de.ibm.com
Fri Oct 28 16:08:17 BST 2016


Hi,

Some more questions and things to look for:

1. Are there several collectors running, potentially at different code
levels?
All collectors should run at the same code level. Simplest case, there is
only one collector.
Sensors and collectors can have different code levels.

2. Is the sensor config updated manually or via mmperfmon config update?
A manual update may be overwritten if the system decides that the config
should be distributed automatically.
See the Knowledge Center and the mmperfmon CLI command for details.

3. What does the sensor config look like (output of mmperfmon config show)?

4. Are only the NSD charts showing no data, or all charts?
Please note: in a SAN environment (where every node sees every NSD), no
NSD server-side data is reported.
GPFS is smart enough to use the local device and bypass the NSD code
stack, so we don't get notified about traffic to the disks.

5. Which code level?
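For items 2 and 3 above, the collector address the sensors report to can be read straight out of the sensor configuration. A minimal sketch follows; the config snippet is an invented sample in the ZIMon syntax (host name and port are placeholders), whereas on a real node you would look at the output of mmperfmon config show or at /opt/IBM/zimon/ZIMonSensors.cfg:

```shell
# Invented sample sensor config in ZIMon syntax (placeholder values,
# not taken from a real cluster)
cat > /tmp/ZIMonSensors.sample <<'EOF'
colCandidates = "guinode"
collectors = {
        host = "guinode"
        port = "4739"
}
sensors = {
        name = "CPU"
        period = 1
}
EOF

# Extract the collector host and port the sensors will report to
host=$(awk -F'"' '/^[[:space:]]*host =/ {print $2}' /tmp/ZIMonSensors.sample)
port=$(awk -F'"' '/^[[:space:]]*port =/ {print $2}' /tmp/ZIMonSensors.sample)
echo "sensors report to ${host}:${port}"
```

If that host is not the node where pmcollector (and the GUI) runs, the charts will stay empty no matter how healthy the services look.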

Thanks,

Mit freundlichen Grüßen / Kind regards

Dr. Markus Rohwedder

Spectrum Scale GUI Development
Phone: +49 7034 6430190
E-Mail: rohwedder at de.ibm.com
IBM Deutschland
Am Weiher 24
65451 Kelsterbach
Germany

IBM Deutschland Research & Development GmbH / Vorsitzender des
Aufsichtsrats: Martina Köderitz
Geschäftsführung: Dirk Wittkopp
Sitz der Gesellschaft: Böblingen / Registergericht: Amtsgericht Stuttgart,
HRB 243294





From:	gpfsug-discuss-request at spectrumscale.org
To:	gpfsug-discuss at spectrumscale.org
Date:	28.10.2016 16:23
Subject:	gpfsug-discuss Digest, Vol 57, Issue 76
Sent by:	gpfsug-discuss-bounces at spectrumscale.org



Send gpfsug-discuss mailing list submissions to
		 gpfsug-discuss at spectrumscale.org

To subscribe or unsubscribe via the World Wide Web, visit
		 http://gpfsug.org/mailman/listinfo/gpfsug-discuss
or, via email, send a message with subject or body 'help' to
		 gpfsug-discuss-request at spectrumscale.org

You can reach the person managing the list at
		 gpfsug-discuss-owner at spectrumscale.org

When replying, please edit your Subject line so it is more specific
than "Re: Contents of gpfsug-discuss digest..."


Today's Topics:

   1. Re: GUI and pmsensors/pmcollector (Mark.Bush at siriuscom.com)
   2. GPFS Log Levels (Joshua Akers)


----------------------------------------------------------------------

Message: 1
Date: Fri, 28 Oct 2016 14:12:25 +0000
From: "Mark.Bush at siriuscom.com" <Mark.Bush at siriuscom.com>
To: gpfsug main discussion list <gpfsug-discuss at spectrumscale.org>
Subject: Re: [gpfsug-discuss] GUI and pmsensors/pmcollector
Message-ID: <66CEE4CD-BD01-4F50-839A-529062DC38D3 at siriuscom.com>
Content-Type: text/plain; charset="utf-8"

Perhaps I needed more description.

I have a 3 node cluster:
2 NSDs (with tiebreaker)
1 GUI node

The NSDs all have pmsensors installed.
The GUI node has pmcollector installed.

I've modified the /opt/IBM/zimon/ZIMonSensors.cfg file on the NSDs to point
to my GUI node.

systemctl start pmsensors on the NSDs
systemctl start pmcollector on the GUI node
systemctl start gpfsgui on the GUI node

The log even shows connections from the two NSDs, but for some reason the
GUI always has a red X in the performance graphs and claims it can't
connect to the collector (which is running on the same node).

Not sure what I'm missing here.  It all works fine when it's all on one
node.
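The edit-and-restart sequence above can be sanity-checked by confirming that each sensor node really points at the GUI node. A hedged sketch: the sample line below stands in for the host entry in /opt/IBM/zimon/ZIMonSensors.cfg, and 4739 as the sensor-to-collector port is an assumption based on typical ZIMon installs:

```shell
# Placeholder: a host line as it might appear in ZIMonSensors.cfg on each
# NSD node after pointing the sensors at the GUI node
sample='        host = "guinode.example.com"'

# The sensors only talk to the right collector if this prints the GUI node
collector=$(printf '%s\n' "$sample" | sed -n 's/.*host = "\(.*\)".*/\1/p')
echo "configured collector: $collector"

# On the GUI node itself you would then verify the collector is up and
# listening, e.g.:
#   systemctl status pmcollector
#   ss -ltn | grep 4739    # 4739 is the usual sensor-to-collector port
```

If the configured collector and the node running pmcollector disagree (short name vs. FQDN counts as disagreement), the GUI shows exactly this red-X symptom.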

Mark

From: <gpfsug-discuss-bounces at spectrumscale.org> on behalf of Stefan
Schmidt <stschmid at de.ibm.com>
Reply-To: gpfsug main discussion list <gpfsug-discuss at spectrumscale.org>
Date: Friday, October 28, 2016 at 8:59 AM
To: gpfsug main discussion list <gpfsug-discuss at spectrumscale.org>
Subject: Re: [gpfsug-discuss] GUI and pmsensors/pmcollector

Hi,

the PMCollector must be installed on the same node as the GUI. The
collectors can be installed on any node you want to monitor.
Hope that helps.

Mit freundlichen Grüßen / Kind regards

Stefan Schmidt

Scrum Master IBM Spectrum Scale GUI / Senior IT Architect / PMP - Dept. M069 /
IBM Spectrum Scale Software Development

IBM Systems Group

IBM Deutschland

________________________________


Phone: +49-703 4274 1966
E-Mail: stschmid at de.ibm.com
IBM Deutschland
Am Weiher 24
65421 Kelsterbach
Germany
________________________________


IBM Deutschland Research & Development GmbH / Vorsitzende des
Aufsichtsrats: Martina Koederitz
Geschäftsführung: Dirk Wittkopp
Sitz der Gesellschaft: Böblingen / Registergericht: Amtsgericht Stuttgart,
HRB 243294








From:        "Mark.Bush at siriuscom.com" <Mark.Bush at siriuscom.com>
To:        "gpfsug-discuss at spectrumscale.org"
<gpfsug-discuss at spectrumscale.org>
Date:        28.10.2016 15:53
Subject:        [gpfsug-discuss] GUI and pmsensors/pmcollector
Sent by:        gpfsug-discuss-bounces at spectrumscale.org

________________________________



I am able to do everything I need just fine when I have a single node
cluster and both pmsensors and pmcollector are installed on the same
node.  When I try to put pmsensors on my NSD nodes and a separate
pmcollector elsewhere, I never get anything to show up in the graphs.
I've configured the sensors with mmperfmon config generate --collectors=
(my node) and I still get nothing.  The logs show that I get connections,
but the GUI never shows data, and I'm unable to get mmperfmon query to
work either; it fails saying it can't find the collector.

Anyone had this much trouble just getting this to work?
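A quick way to tell whether the collector holds any data at all, independent of the GUI, is to look at what a query returns. Sketch only: the rows below are an invented approximation of mmperfmon query output (real output varies by release and configured sensors), used here to show the check itself:

```shell
# Invented sample in the shape of "mmperfmon query" output
# (made-up rows; a real run depends on release and configured sensors)
sample_output='Row   Timestamp            cpu_user
  1   2016-10-28-14:10:00  1.2
  2   2016-10-28-14:10:01  0.8
  3   2016-10-28-14:10:02  1.5'

# If a query only returns "null" values or an error about reaching the
# collector, the sensors are not delivering data.  Count real data rows:
rows=$(printf '%s\n' "$sample_output" | awk 'NR > 1 && $3 != "null" {n++} END {print n+0}')
echo "data rows: $rows"
```

Zero data rows with sensors "connected" usually means the sensors connected to a different collector than the one the GUI queries.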



Mark

This message (including any attachments) is intended only for the use of
the individual or entity to which it is addressed and may contain
information that is non-public, proprietary, privileged, confidential, and
exempt from disclosure under applicable law. If you are not the intended
recipient, you are hereby notified that any use, dissemination,
distribution, or copying of this communication is strictly prohibited. This
message may be viewed by parties at Sirius Computer Solutions other than
those named in the message header. This message does not contain an
official representation of Sirius Computer Solutions. If you have received
this communication in error, notify Sirius Computer Solutions immediately
and (i) destroy this message if a facsimile or (ii) delete this message
immediately if this is an electronic communication. Thank you.

Sirius Computer Solutions<http://www.siriuscom.com/>
_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss



------------------------------

Message: 2
Date: Fri, 28 Oct 2016 14:22:49 +0000
From: Joshua Akers <akers at vt.edu>
To: gpfsug-discuss at spectrumscale.org
Subject: [gpfsug-discuss] GPFS Log Levels
Message-ID:

<CAHO5rBGDrHaFTTdJVcEsGG8qGtJF5ktREWuKjF-dk4R_qOOKUQ at mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Hi all,

I am trying to find more information on GPFS log levels. Here is what I
have so far:

[D] - Detail info
[I] - Info
[N] - Notice
[W] - Warning
[E] - Error
[X] - Critical Error
[A] - Deadlock related?
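Assuming those tags appear bracketed as a standalone token early in each mmfs.log line (which matches logs I have seen, but check your own), a quick severity histogram can be pulled with awk. The sample lines below are invented for illustration:

```shell
# Invented sample lines in the shape of /var/adm/ras/mmfs.log.latest entries
cat > /tmp/mmfs.log.sample <<'EOF'
2016-10-28_14:00:01.123-0400: [I] Node joined the cluster.
2016-10-28_14:00:02.456-0400: [W] Disk d1 nearing capacity.
2016-10-28_14:00:03.789-0400: [E] Unable to contact node n2.
2016-10-28_14:00:04.012-0400: [I] Recovery complete.
EOF

# Histogram of the severity tags ([I], [W], [E], ...) listed above
counts=$(awk '{for (i = 1; i <= NF; i++)
                 if ($i ~ /^\[[A-Z]\]$/) { c[$i]++; break }}
              END {for (t in c) print t, c[t]}' /tmp/mmfs.log.sample | sort)
echo "$counts"
```

On a real system, pointing the same awk at mmfs.log.latest gives a fast feel for how noisy each level is before digging into individual entries.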

Any corrections or additional information would be greatly appreciated.

Thanks,
Josh
--
*Joshua D. Akers*

*HPC Systems Specialist*
NI&S Systems Support (MC0214)
1700 Pratt Drive
Blacksburg, VA 24061
540-231-9506

------------------------------

_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss


End of gpfsug-discuss Digest, Vol 57, Issue 76
**********************************************



