[StarCluster] ssh Issue After Migrating AMI to New Region

Nolan Nichols bnniii at uw.edu
Wed Aug 10 02:46:07 EDT 2011


Hello,

I recently created a new AMI from the StarCluster AMI using the "starcluster
createimage" utility as described here:
http://web.mit.edu/stardev/cluster/docs/0.92rc2/manual/create_new_ami.html
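
For reference, the create step looked something like this (the instance ID, image name, and bucket name below are placeholders, and I am quoting the command shape from memory):

(test)nolan@nolan-mac:~$ starcluster createimage i-xxxxxxxx my-starcluster-ami my-us-east-bucket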

The image works as expected when launched from an S3 bucket in the us-east-1
region. However, after I migrated the AMI to an S3 bucket in the us-west-1
region with the "ec2-migrate-image" utility, SSH never comes up: a single
instance or a whole cluster starts without problems, but the SSH availability
check never completes.
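
For completeness, the migration step was along these lines (bucket and
manifest names are placeholders, and I am writing the flags from memory, so
they may be slightly off):

(test)nolan@nolan-mac:~$ ec2-migrate-image -K pk.pem -C cert.pem \
    -o $AWS_ACCESS_KEY_ID -w $AWS_SECRET_ACCESS_KEY \
    --bucket my-us-east-bucket --manifest my-starcluster-ami.manifest.xml \
    --destination-bucket my-us-west-bucket --region us-west-1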

(test)nolan@nolan-mac:~$ starcluster start ibicluster
...
>>> Waiting for cluster to come up... (updating every 30s)
>>> Waiting for instances to activate...
>>> Waiting for all nodes to be in a 'running' state...
2/2 |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||| 100%
>>> Waiting for SSH to come up on all nodes...
0/2 |                                                                  | 0%
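
My next step is to try ssh'ing into one of the nodes by hand with verbose
output to see where the connection stalls, e.g. (key path and hostname are
placeholders):

(test)nolan@nolan-mac:~$ ssh -v -i ~/.ssh/mykey.rsa root@ec2-xx-xx-xx-xx.us-west-1.compute.amazonaws.com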

I am using a clone of StarCluster from GitHub, Python 2.7.1 (EPD), and
boto 2.0b4. I tried the recommendation suggested here to remove all security
groups starting with '@sc-': https://github.com/jtriley/StarCluster/issues/14,
but no luck yet.
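
For the record, the cleanup attempt looked roughly like this (the group name
is a placeholder; I listed the groups first, then deleted the stale ones):

(test)nolan@nolan-mac:~$ ec2-describe-group --region us-west-1
(test)nolan@nolan-mac:~$ ec2-delete-group @sc-ibicluster --region us-west-1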

Any ideas?

Thanks,

Nolan