[gpfsug-discuss] GUI and pmsensors/pmcollector

Michael L Taylor taylorm at us.ibm.com
Fri Oct 28 15:56:30 BST 2016


Have a quick read of this link and see if it helps:
https://www.ibm.com/support/knowledgecenter/STXKQY_4.2.1/com.ibm.spectrum.scale.v4r21.doc/bl1ins_manualinstallofgui.htm


From:	gpfsug-discuss-request at spectrumscale.org
To:	gpfsug-discuss at spectrumscale.org
Date:	10/28/2016 07:23 AM
Subject:	gpfsug-discuss Digest, Vol 57, Issue 76
Sent by:	gpfsug-discuss-bounces at spectrumscale.org




Message: 1
Date: Fri, 28 Oct 2016 14:12:25 +0000
From: "Mark.Bush at siriuscom.com" <Mark.Bush at siriuscom.com>
To: gpfsug main discussion list <gpfsug-discuss at spectrumscale.org>
Subject: Re: [gpfsug-discuss] GUI and pmsensors/pmcollector
Message-ID: <66CEE4CD-BD01-4F50-839A-529062DC38D3 at siriuscom.com>
Content-Type: text/plain; charset="utf-8"

Perhaps I needed more description

I have a 3-node cluster:
2 NSD servers (with tiebreaker)
1 GUI node

The NSD servers all have pmsensors installed.
The GUI node has pmcollector installed.

I've modified the /opt/IBM/zimon/ZIMonSensors.cfg file on the NSD servers to
point to my GUI node.
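For reference, the collector target lives in the collectors stanza of
/opt/IBM/zimon/ZIMonSensors.cfg. A minimal sketch of that stanza (the host
name is a placeholder, and 4739 is the ZIMon default collector port; verify
both against your installed config):

```
collectors = {
    host = "gui-node.example.com"
    port = "4739"
}
```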

systemctl start pmsensors on the NSD servers
systemctl start pmcollector on the GUI node
systemctl start gpfsgui on the GUI node

The log even shows connections from the two NSD servers, but for some reason
the GUI always has a red X in the performance graphs and claims it can't
connect to the collector (which is running on the same node).

Not sure what I'm missing here.  It all works fine when it's all on one
node.
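One quick sanity check for this kind of split setup is to confirm from an NSD
server that the collector's TCP ports are actually reachable. A hedged sketch
(the host name is a placeholder, and 4739/9980 are the ZIMon defaults for the
sensor and query ports respectively; confirm yours in ZIMonSensors.cfg):

```python
import socket

def collector_reachable(host, port, timeout=2.0):
    """Return True if a TCP listener answers on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 4739: sensors -> collector; 9980: query port used by mmperfmon/GUI
# (defaults assumed here -- check your ZIMonSensors.cfg).
for port in (4739, 9980):
    print(port, collector_reachable("gui-node.example.com", port))
```

If the sensor port connects but the query port does not, the sensors can
report in while the GUI and mmperfmon still fail, which would match the
symptoms above.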

Mark

From: <gpfsug-discuss-bounces at spectrumscale.org> on behalf of Stefan
Schmidt <stschmid at de.ibm.com>
Reply-To: gpfsug main discussion list <gpfsug-discuss at spectrumscale.org>
Date: Friday, October 28, 2016 at 8:59 AM
To: gpfsug main discussion list <gpfsug-discuss at spectrumscale.org>
Subject: Re: [gpfsug-discuss] GUI and pmsensors/pmcollector

Hi,

the pmcollector service must be installed on the same node as the GUI. The
sensors (pmsensors) can be installed on any node you want to monitor.
Hope that helps.

Mit freundlichen Grüßen / Kind regards

Stefan Schmidt

Scrum Master IBM Spectrum Scale GUI / Senior IT Architect / PMP® - Dept. M069 /
IBM Spectrum Scale Software Development

IBM Systems Group

IBM Deutschland

________________________________

Phone: +49-703 4274 1966
E-Mail: stschmid at de.ibm.com

IBM Deutschland
Am Weiher 24
65421 Kelsterbach
Germany

________________________________


IBM Deutschland Research & Development GmbH / Vorsitzende des
Aufsichtsrats: Martina Koederitz
Geschäftsführung: Dirk Wittkopp
Sitz der Gesellschaft: Böblingen / Registergericht: Amtsgericht Stuttgart,
HRB 243294


From:        "Mark.Bush at siriuscom.com" <Mark.Bush at siriuscom.com>
To:        "gpfsug-discuss at spectrumscale.org"
<gpfsug-discuss at spectrumscale.org>
Date:        28.10.2016 15:53
Subject:        [gpfsug-discuss] GUI and pmsensors/pmcollector
Sent by:        gpfsug-discuss-bounces at spectrumscale.org

________________________________



I am able to do everything I need just fine when I have a single-node
cluster and both pmsensors and pmcollector are installed on the same
node.  When I try to run pmsensors on my NSD servers and a separate
pmcollector elsewhere, I can never get anything to show up in the graphs.
I've configured the sensors with mmperfmon config generate --collectors=
(my node) and I still get nothing.  The logs show that I get connections,
but the GUI never shows anything, and I'm unable to get mmperfmon query to
work either; it fails saying it can't find the collector.

Anyone had this much trouble just getting this to work?



Mark

_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss



------------------------------

Message: 2
Date: Fri, 28 Oct 2016 14:22:49 +0000
From: Joshua Akers <akers at vt.edu>
To: gpfsug-discuss at spectrumscale.org
Subject: [gpfsug-discuss] GPFS Log Levels
Message-ID:

<CAHO5rBGDrHaFTTdJVcEsGG8qGtJF5ktREWuKjF-dk4R_qOOKUQ at mail.gmail.com>
Content-Type: text/plain; charset="utf-8"

Hi all,

I am trying to find more information on GPFS log levels. Here is what I
have so far:

[D] - Detail info
[I] - Info
[N] - Notice
[W] - Warning
[E] - Error
[X] - Critical Error
[A] - Deadlock related?

Any corrections or additional information would be greatly appreciated.
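
The list above can be turned into a small log classifier. A minimal sketch,
assuming the one-letter tag always appears in square brackets and mapping [A]
to "alert" (it shows up in deadlock-detection messages, per the question
above — verify against your own mmfs.log):

```python
import re

# Severity tags as listed above; [A] = "alert" is an assumption.
SEVERITIES = {
    "D": "detail",
    "I": "informational",
    "N": "notice",
    "W": "warning",
    "E": "error",
    "X": "critical",
    "A": "alert",
}

TAG_RE = re.compile(r"\[([A-Z])\]")

def severity_of(line):
    """Return the severity name for a log line, or None if untagged."""
    m = TAG_RE.search(line)
    return SEVERITIES.get(m.group(1)) if m else None

print(severity_of("2016-10-28_14:12:25 [E] Unable to contact collector"))
```

This makes it easy to grep a log for everything at warning level and above,
for example.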

Thanks,
Josh
--
*Joshua D. Akers*

*HPC Systems Specialist*
NI&S Systems Support (MC0214)
1700 Pratt Drive
Blacksburg, VA 24061
540-231-9506

------------------------------



End of gpfsug-discuss Digest, Vol 57, Issue 76
**********************************************

