I’ve set up a Linux VM in Azure and added an inbound rule for the port Apache is listening on. A `curl localhost` on the VM returns the Apache HTML page, but when I hit the VM’s public IP I get nothing. Any ideas? Answer According to your description, please check these settings: 1. Please check Azure
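The usual culprits are the VM-local firewall and the Network Security Group, checked in that order. A minimal sketch (the resource group and NSG names below are placeholders, not from the question):

```shell
# 1. Confirm Apache is listening on all interfaces, not just 127.0.0.1:
sudo ss -tlnp | grep ':80'

# 2. Check the local firewall on the VM (ufw on Ubuntu):
sudo ufw status
sudo ufw allow 80/tcp

# 3. Verify the Network Security Group allows inbound traffic on port 80
#    (myResourceGroup / myNsg are placeholder names):
az network nsg rule create \
  --resource-group myResourceGroup \
  --nsg-name myNsg \
  --name allow-http \
  --priority 300 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 80
```

If step 1 shows Apache bound only to `127.0.0.1`, the fix is in the Apache `Listen` directive rather than in Azure.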
Tag: azure
Only blobs formatted as VHDs can be imported
I am following the instructions at https://learn.microsoft.com/en-us/azure/virtual-machines/linux/upload-vhd to create a VM in Azure. At this stage, though, az disk create --resource-group myResourceGroup --name myManagedDisk --source https://mystorageaccount.blob.core.windows.net/mydisks/myDisk.vhd I get the following error: "Only blobs formatted as VHDs can be imported." I believe the VHD is a Linux OS. Any ideas as to what I have missed, or does someone have a link
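This error typically means the uploaded blob is not a fixed-size VHD (dynamic VHDs and VHDX are rejected, and the virtual size must be a whole number of MiB). A sketch of converting the image before upload, assuming a qcow2 source and placeholder file names:

```shell
# Convert to a fixed-size VHD (format "vpc") and round the virtual size
# up to the nearest MiB, which Azure requires:
qemu-img convert -f qcow2 -O vpc -o subformat=fixed,force_size \
  source.qcow2 myDisk.vhd

# Re-upload the blob, then retry the disk creation. Note the flags must
# use double hyphens; the en-dashes in the pasted command ("–resource-group")
# would also make the CLI fail:
az disk create --resource-group myResourceGroup --name myManagedDisk \
  --source https://mystorageaccount.blob.core.windows.net/mydisks/myDisk.vhd
```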
How to get the number of workers, cores, and RAM from an HDI cluster
Here is my scenario: I am creating an HDI cluster and installing my custom application using an ARM template, via the CustomScript option. I need the following values configured for my application from a shell script: number of worker nodes, number of cores per worker node, RAM per worker node, RAM per head node, number of cores
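One way to obtain these from a script running on the cluster is the Ambari REST API, which HDInsight exposes on the head node. A sketch, assuming the script runs on a cluster node where the Ambari server is reachable as `headnodehost:8080`; `CLUSTERNAME`, the admin user, and the password are placeholders:

```shell
AMBARI="http://headnodehost:8080/api/v1/clusters/CLUSTERNAME"

# Number of worker nodes (HDInsight worker hosts are named workernode*):
curl -s -u admin:'PASSWORD' "$AMBARI/hosts" \
  | grep -o '"host_name" : "workernode[^"]*"' | wc -l

# CPU count and total memory (in KB) for every host, head nodes included:
curl -s -u admin:'PASSWORD' \
  "$AMBARI/hosts?fields=Hosts/cpu_count,Hosts/total_mem"
```

Parsing the JSON with `grep` keeps the sketch dependency-free; a real script would be more robust with `jq`.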
What is the best way to link two ARM templates in Azure
Template 1: an Azure VM. Template 2: an HDI cluster. I am able to deploy 1 and 2 separately without any issues. Now I want to (3) get the HDI cluster head-node IP and use it in a property file on the Azure VM. How can I achieve 3 and deploy 1, 2, and 3 in order in one go? Answer According to your scenario, you could use
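Besides nested/linked templates (a `Microsoft.Resources/deployments` resource with a `templateLink` plus `dependsOn`), the ordering can also be driven from the CLI by passing one deployment's output into the next. A sketch, assuming the HDI template declares a `headnodeIp` output and the VM template accepts a `headnodeIp` parameter (both names are placeholders):

```shell
# Deploy the HDI cluster first:
az group deployment create -g myResourceGroup \
  --template-file template2-hdi.json --name hdi

# Read the head-node IP from the deployment's outputs:
HEADNODE_IP=$(az group deployment show -g myResourceGroup --name hdi \
  --query properties.outputs.headnodeIp.value -o tsv)

# Deploy the VM, feeding the IP in as a parameter so the CustomScript
# extension can write it into the property file:
az group deployment create -g myResourceGroup \
  --template-file template1-vm.json --name vm \
  --parameters headnodeIp="$HEADNODE_IP"
```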
Azure VM Linux Ubuntu backup agent – Deployment failed
We have two Ubuntu (14.04) Linux server VMs hosted on Azure, and we are currently trying to set up the Azure backup service for these machines. I have followed this guide and have the agent up and running. When running “waagent -version” in bash, the output shows: When I then attempt to enable the backup service in Azure I get the error “deployment failed”, here
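Backup-extension deployment failures on older images are often caused by an outdated VM agent, so one thing worth checking is the agent version and upgrading it from the distribution packages. A sketch, assuming Ubuntu 14.04 with the stock `walinuxagent` package:

```shell
# Current agent version:
waagent -version

# Upgrade the agent from the Ubuntu archive, then restart it
# (14.04 uses upstart, hence the `service` fallback):
sudo apt-get update
sudo apt-get install --only-upgrade walinuxagent
sudo systemctl restart walinuxagent 2>/dev/null \
  || sudo service walinuxagent restart
```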
How to download a sub-directory from Azure Blob storage using blobxfer
I am using Ubuntu 14.04 64-bit. How can I use blobxfer to download a sub-directory from Azure Blob storage to my local machine? Say I have a storage account named MY_STORAGE_ACCOUNT, a container named MY_CONTAINER, and the following contents inside the container. I want to download only the sub-directory directory3. I have a SAS key which has
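With the blobxfer 1.x command line, a sub-directory can be selected by putting it in the remote path. A sketch, assuming blobxfer 1.x and using the names from the question (the SAS token itself is a placeholder):

```shell
# Download only MY_CONTAINER/directory3 into ./directory3.
# The --sas value is the token portion of your SAS (placeholder here).
blobxfer download \
  --storage-account MY_STORAGE_ACCOUNT \
  --sas 'SAS_TOKEN_HERE' \
  --remote-path MY_CONTAINER/directory3 \
  --local-path ./directory3
```

Older 0.x releases of blobxfer used a different positional syntax, so check `blobxfer --version` first.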
How to access a website on a Linux server in Azure? [closed]
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers. This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question
How to download an Azure File share using a SAS (Shared Access Signature) onto a Linux machine
I have a SAS generated for a file share (with read and list privileges, no write privileges). My SAS looks like the following: “https://test.file.core.windows.net/testf1?[some_token_here]”. I used AzCopy to download the files through the above SAS onto a Windows virtual machine; however, AzCopy is not present on Linux. How do I download the files using the above SAS onto my Linux
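Because a SAS is just a query string appended to the resource URL, individual files can be fetched on Linux with plain `curl` against the File service REST endpoint. A sketch using the share from the question; the directory, file name, and token are placeholders:

```shell
# Your SAS token, without the leading '?' (placeholder value):
SAS='sv=...&sig=...'

# Fetch one file from the share; -f makes curl fail on HTTP errors:
curl -f -o myfile.txt \
  "https://test.file.core.windows.net/testf1/somedir/myfile.txt?${SAS}"
```

The list privilege lets you enumerate directory contents via the REST List Directories and Files operation (`&restype=directory&comp=list` appended to a directory URL) if you need to discover file names first.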
How to find the FQDN of an Azure VM
I just spun up a new Azure VM from the portal. I want to find the FQDN of said VM so that I can connect to it via SSH, using this scheme: ssh username@FQDN -p 22. Can anyone tell me where to look? I’ve spent many hours trying to do this, but to no avail. Answer For you to
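A portal-created VM has no FQDN until a DNS name label is set on its public IP; once it is, `az vm show -d` reports it. A sketch with placeholder resource names:

```shell
# Give the VM's public IP a DNS label (placeholder names throughout):
az network public-ip update \
  --resource-group myResourceGroup \
  --name myVmPublicIp \
  --dns-name myvmlabel

# The -d (--show-details) flag adds the fqdns field to the output:
az vm show -d --resource-group myResourceGroup --name myVm \
  --query fqdns -o tsv

# The FQDN has the form <label>.<region>.cloudapp.azure.com, so:
# ssh username@myvmlabel.westus.cloudapp.azure.com -p 22
```

The same label can be set in the portal under the public IP resource's Configuration blade.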
Azure command-line tool for a Linux machine to upload/download data to/from Azure Storage
I am looking for an Azure command-line tool for Linux. I found azure-cli, but as far as I know it does not support upload and download for Azure Storage. I found some CLI tools, but they run only on Windows: AzCopy and CloudCopy. So can anybody suggest the best Azure CLI tool that can run on a Linux machine and
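The cross-platform Azure CLI does in fact include blob upload/download commands that run on Linux. A sketch; the account name, key, container, and file paths are placeholders:

```shell
# Credentials can be supplied via environment variables instead of
# repeating --account-name/--account-key on every call:
export AZURE_STORAGE_ACCOUNT=mystorageaccount
export AZURE_STORAGE_KEY='...'   # your account key

# Upload a local file as a blob:
az storage blob upload --container-name mycontainer \
  --name remote/blob.bin --file ./local.bin

# Download it back:
az storage blob download --container-name mycontainer \
  --name remote/blob.bin --file ./copy.bin
```

For bulk transfers, AzCopy has since gained Linux support as well, but the `az storage` commands cover the basic upload/download case.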